id stringlengths 2 115 | lastModified stringlengths 24 24 | tags list | author stringlengths 2 42 ⌀ | description stringlengths 0 68.7k ⌀ | citation stringlengths 0 10.7k ⌀ | cardData null | likes int64 0 3.55k | downloads int64 0 10.1M | card stringlengths 0 1.01M |
|---|---|---|---|---|---|---|---|---|---|
isashap/resumeconcat | 2023-08-30T18:12:00.000Z | [
"region:us"
] | isashap | null | null | null | 0 | 0 | |
open-llm-leaderboard/details_DataLinguistic__DataLinguistic-34B-V1.0 | 2023-08-30T18:13:15.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of DataLinguistic/DataLinguistic-34B-V1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DataLinguistic/DataLinguistic-34B-V1.0](https://huggingface.co/DataLinguistic/DataLinguistic-34B-V1.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DataLinguistic__DataLinguistic-34B-V1.0\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-30T18:11:50.917227](https://huggingface.co/datasets/open-llm-leaderboard/details_DataLinguistic__DataLinguistic-34B-V1.0/blob/main/results_2023-08-30T18%3A11%3A50.917227.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23205864259563588,\n\
\ \"acc_stderr\": 0.030715936843036345,\n \"acc_norm\": 0.2336046598801071,\n\
\ \"acc_norm_stderr\": 0.030731498766140584,\n \"mc1\": 0.2460220318237454,\n\
\ \"mc1_stderr\": 0.015077219200662578,\n \"mc2\": 0.48734329486879213,\n\
\ \"mc2_stderr\": 0.01631194781446388\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.23293515358361774,\n \"acc_stderr\": 0.012352507042617396,\n\
\ \"acc_norm\": 0.2764505119453925,\n \"acc_norm_stderr\": 0.013069662474252428\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2819159529974109,\n\
\ \"acc_stderr\": 0.004490130691020429,\n \"acc_norm\": 0.3296156144194384,\n\
\ \"acc_norm_stderr\": 0.004691128722535484\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.2460220318237454,\n \"mc1_stderr\": 0.015077219200662578,\n\
\ \"mc2\": 0.48734329486879213,\n \"mc2_stderr\": 0.01631194781446388\n\
\ }\n}\n```"
repo_url: https://huggingface.co/DataLinguistic/DataLinguistic-34B-V1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|arc:challenge|25_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hellaswag|10_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T18:11:50.917227.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T18:11:50.917227.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T18:11:50.917227.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T18:11:50.917227.parquet'
- config_name: results
data_files:
- split: 2023_08_30T18_11_50.917227
path:
- results_2023-08-30T18:11:50.917227.parquet
- split: latest
path:
- results_2023-08-30T18:11:50.917227.parquet
---
# Dataset Card for Evaluation run of DataLinguistic/DataLinguistic-34B-V1.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/DataLinguistic/DataLinguistic-34B-V1.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [DataLinguistic/DataLinguistic-34B-V1.0](https://huggingface.co/DataLinguistic/DataLinguistic-34B-V1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DataLinguistic__DataLinguistic-34B-V1.0",
"harness_truthfulqa_mc_0",
split="train")
```
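Each timestamped split name (e.g. `2023_08_30T18_11_50.917227`) is the run timestamp with `-` and `:` replaced by underscores. A small helper (illustrative only, not part of the `datasets` library) can convert such a split name back into a `datetime`:

```python
from datetime import datetime

def split_name_to_datetime(split_name: str) -> datetime:
    # Split names look like "2023_08_30T18_11_50.917227":
    # underscores stand in for the "-" date and ":" time separators.
    date_part, time_part = split_name.split("T")
    iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.fromisoformat(iso)

print(split_name_to_datetime("2023_08_30T18_11_50.917227"))
# 2023-08-30 18:11:50.917227
```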
## Latest results
These are the [latest results from run 2023-08-30T18:11:50.917227](https://huggingface.co/datasets/open-llm-leaderboard/details_DataLinguistic__DataLinguistic-34B-V1.0/blob/main/results_2023-08-30T18%3A11%3A50.917227.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23205864259563588,
"acc_stderr": 0.030715936843036345,
"acc_norm": 0.2336046598801071,
"acc_norm_stderr": 0.030731498766140584,
"mc1": 0.2460220318237454,
"mc1_stderr": 0.015077219200662578,
"mc2": 0.48734329486879213,
"mc2_stderr": 0.01631194781446388
},
"harness|arc:challenge|25": {
"acc": 0.23293515358361774,
"acc_stderr": 0.012352507042617396,
"acc_norm": 0.2764505119453925,
"acc_norm_stderr": 0.013069662474252428
},
"harness|hellaswag|10": {
"acc": 0.2819159529974109,
"acc_stderr": 0.004490130691020429,
"acc_norm": 0.3296156144194384,
"acc_norm_stderr": 0.004691128722535484
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2460220318237454,
"mc1_stderr": 0.015077219200662578,
"mc2": 0.48734329486879213,
"mc2_stderr": 0.01631194781446388
}
}
```
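Given a results payload shaped like the one above, the per-subject MMLU (`hendrycksTest`) accuracies can be averaged with a few lines of plain Python (a sketch over a hypothetical sample dict, not part of the evaluation harness):

```python
def mmlu_average(results: dict) -> float:
    # Average the "acc" metric over all hendrycksTest (MMLU) subtasks,
    # ignoring other benchmarks such as truthfulqa.
    accs = [
        metrics["acc"]
        for task, metrics in results.items()
        if task.startswith("harness|hendrycksTest")
    ]
    return sum(accs) / len(accs)

# Small illustrative payload with the same key layout as the JSON above.
sample = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.2},
    "harness|hendrycksTest-virology|5": {"acc": 0.3},
    "harness|truthfulqa:mc|0": {"mc1": 0.25},
}
print(round(mmlu_average(sample), 4))  # 0.25
```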
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
isashap/resumenew | 2023-08-31T20:40:48.000Z | [
"language:en",
"region:us"
] | isashap | null | null | null | 0 | 0 | ---
language:
- en
--- |
Vierza/dido_OnlyRealv30_V1 | 2023-08-30T18:34:47.000Z | [
"region:us"
] | Vierza | null | null | null | 0 | 0 | Entry not found |
isashap/pleasework | 2023-08-30T20:16:04.000Z | [
"language:en",
"region:us"
] | isashap | null | null | null | 0 | 0 | ---
language:
- en
pretty_name: AI Resume
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
KeiTaaRo/aivoice | 2023-08-30T18:43:23.000Z | [
"region:us"
] | KeiTaaRo | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_TheBloke__Airoboros-L2-13B-2.1-GPTQ | 2023-09-22T16:15:34.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TheBloke/Airoboros-L2-13B-2.1-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/Airoboros-L2-13B-2.1-GPTQ](https://huggingface.co/TheBloke/Airoboros-L2-13B-2.1-GPTQ)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Airoboros-L2-13B-2.1-GPTQ\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T16:15:21.953879](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Airoboros-L2-13B-2.1-GPTQ/blob/main/results_2023-09-22T16-15-21.953879.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each task in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.36325503355704697,\n\
\ \"em_stderr\": 0.004925249459609538,\n \"f1\": 0.4313957634228207,\n\
\ \"f1_stderr\": 0.004768180025704384,\n \"acc\": 0.4016912073136653,\n\
\ \"acc_stderr\": 0.00940489808002435\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.36325503355704697,\n \"em_stderr\": 0.004925249459609538,\n\
\ \"f1\": 0.4313957634228207,\n \"f1_stderr\": 0.004768180025704384\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05989385898407885,\n \
\ \"acc_stderr\": 0.006536148151288716\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7434885556432518,\n \"acc_stderr\": 0.012273648008759982\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TheBloke/Airoboros-L2-13B-2.1-GPTQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|arc:challenge|25_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T16_15_21.953879
path:
- '**/details_harness|drop|3_2023-09-22T16-15-21.953879.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T16-15-21.953879.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T16_15_21.953879
path:
- '**/details_harness|gsm8k|5_2023-09-22T16-15-21.953879.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T16-15-21.953879.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hellaswag|10_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T18:43:07.011974.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T18:43:07.011974.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T18:43:07.011974.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T16_15_21.953879
path:
- '**/details_harness|winogrande|5_2023-09-22T16-15-21.953879.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T16-15-21.953879.parquet'
- config_name: results
data_files:
- split: 2023_08_30T18_43_07.011974
path:
- results_2023-08-30T18:43:07.011974.parquet
- split: 2023_09_22T16_15_21.953879
path:
- results_2023-09-22T16-15-21.953879.parquet
- split: latest
path:
- results_2023-09-22T16-15-21.953879.parquet
---
# Dataset Card for Evaluation run of TheBloke/Airoboros-L2-13B-2.1-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Airoboros-L2-13B-2.1-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Airoboros-L2-13B-2.1-GPTQ](https://huggingface.co/TheBloke/Airoboros-L2-13B-2.1-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
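Splits are named after run timestamps (for example `2023_09_22T16_15_21.953879`), and the `latest` split simply points at the most recent run. As a small sketch using the two run timestamps listed in this card: because the timestamps are zero-padded, lexicographic order matches chronological order, so the latest run can be picked with a plain `max`:

```python
# Split names encode run timestamps; the zero-padded format means
# lexicographic order matches chronological order, so "latest" is
# simply the maximum split name.
splits = ["2023_08_30T18_43_07.011974", "2023_09_22T16_15_21.953879"]
latest = max(splits)
print(latest)  # -> 2023_09_22T16_15_21.953879
```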
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Airoboros-L2-13B-2.1-GPTQ",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-22T16:15:21.953879](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Airoboros-L2-13B-2.1-GPTQ/blob/main/results_2023-09-22T16-15-21.953879.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.36325503355704697,
"em_stderr": 0.004925249459609538,
"f1": 0.4313957634228207,
"f1_stderr": 0.004768180025704384,
"acc": 0.4016912073136653,
"acc_stderr": 0.00940489808002435
},
"harness|drop|3": {
"em": 0.36325503355704697,
"em_stderr": 0.004925249459609538,
"f1": 0.4313957634228207,
"f1_stderr": 0.004768180025704384
},
"harness|gsm8k|5": {
"acc": 0.05989385898407885,
"acc_stderr": 0.006536148151288716
},
"harness|winogrande|5": {
"acc": 0.7434885556432518,
"acc_stderr": 0.012273648008759982
}
}
```
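As a quick sanity check, the reported "all" accuracy is the unweighted mean of the two per-task accuracies above (a minimal sketch; the task names and values are copied directly from the JSON block shown):

```python
# Recompute the aggregate "acc" from the per-task harness metrics above.
# Values are taken verbatim from the latest-results JSON block.
results = {
    "harness|gsm8k|5": {"acc": 0.05989385898407885},
    "harness|winogrande|5": {"acc": 0.7434885556432518},
}

mean_acc = sum(task["acc"] for task in results.values()) / len(results)
print(mean_acc)  # ~0.4017, matching the reported "all" accuracy
```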
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
loubnabnl/humaneval_plus | 2023-08-30T20:10:39.000Z | [
"region:us"
] | loubnabnl | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: task_id
dtype: string
- name: prompt
dtype: string
- name: entry_point
dtype: string
- name: canonical_solution
dtype: string
- name: test
dtype: string
- name: contract
dtype: string
- name: base_input
dtype: string
- name: atol
dtype: float64
- name: plus_input
dtype: string
splits:
- name: train
num_bytes: 7571857
num_examples: 164
download_size: 2006302
dataset_size: 7571857
---
# Dataset Card for "humaneval_plus"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Musha-the-Yusha/mushi-snli-llama2-grammar_struct-10k | 2023-08-31T10:25:29.000Z | [
"region:us"
] | Musha-the-Yusha | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3456483
num_examples: 10000
download_size: 1106306
dataset_size: 3456483
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "mushi-snli-llama2-grammar_struct-10k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kaenakiakona/hawaiian | 2023-09-03T03:58:03.000Z | [
"region:us"
] | kaenakiakona | null | null | null | 0 | 0 | Entry not found |
pnadel/pri_docidv2 | 2023-08-30T19:51:39.000Z | [
"region:us"
] | pnadel | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Handwritten
'1': Publication
'2': Tabular
'3': Typeset document
'4': Unusable
splits:
- name: train
num_bytes: 21654067.34670487
num_examples: 488
- name: test
num_bytes: 9318348.65329513
num_examples: 210
download_size: 30987466
dataset_size: 30972416.0
---
# Dataset Card for "pri_docidv2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alx-ai/arxiv-csv-2001-001 | 2023-08-30T19:52:22.000Z | [
"region:us"
] | alx-ai | null | null | null | 0 | 0 | Entry not found |
CommunistCowGod/kizn | 2023-08-30T19:58:31.000Z | [
"region:us"
] | CommunistCowGod | null | null | null | 0 | 0 | Entry not found |
Neu256/Arc-diffusion-1.2.1 | 2023-08-30T20:27:02.000Z | [
"region:us"
] | Neu256 | null | null | null | 0 | 0 | Image + Text! |
andrewsiah/rlhf | 2023-09-02T15:27:38.000Z | [
"license:cc-by-nc-4.0",
"region:us"
] | andrewsiah | null | null | null | 0 | 0 | ---
license: cc-by-nc-4.0
dataset_info:
features:
- name: instruction
dtype: 'null'
- name: input
dtype: string
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: preference
dtype: int64
splits:
- name: train
num_bytes: 10956107
num_examples: 8531
download_size: 6514579
dataset_size: 10956107
---
|
yzhuang/autotree_automl_credit_gosdt_l256_d3_sd0 | 2023-08-30T20:34:00.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 3388000000
num_examples: 100000
- name: validation
num_bytes: 338800000
num_examples: 10000
download_size: 840908403
dataset_size: 3726800000
---
# Dataset Card for "autotree_automl_credit_gosdt_l256_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JihyukKim/eli5-subquestion-paired | 2023-08-30T20:38:49.000Z | [
"region:us"
] | JihyukKim | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: qid
dtype: string
- name: question
dtype: string
- name: response_j
dtype: string
- name: response_k
dtype: string
- name: gold_claims
sequence: string
- name: response_j_claims
sequence: string
- name: response_k_claims
sequence: string
splits:
- name: train
num_bytes: 52398773
num_examples: 43725
- name: test
num_bytes: 981221
num_examples: 831
download_size: 8716749
dataset_size: 53379994
---
# Dataset Card for "eli5-subquestion-paired"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
oriyonay/AQC-v1-category-only | 2023-08-30T20:39:04.000Z | [
"license:other",
"region:us"
] | oriyonay | null | null | null | 0 | 0 | ---
license: other
---
|
ireallyneedajob/test1 | 2023-08-30T20:40:36.000Z | [
"region:us"
] | ireallyneedajob | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_Fredithefish__Guanaco-3B-Uncensored-v2 | 2023-08-30T20:47:59.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Fredithefish/Guanaco-3B-Uncensored-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Fredithefish/Guanaco-3B-Uncensored-v2](https://huggingface.co/Fredithefish/Guanaco-3B-Uncensored-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Fredithefish__Guanaco-3B-Uncensored-v2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-30T20:46:40.695179](https://huggingface.co/datasets/open-llm-leaderboard/details_Fredithefish__Guanaco-3B-Uncensored-v2/blob/main/results_2023-08-30T20%3A46%3A40.695179.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26785174321367716,\n\
\ \"acc_stderr\": 0.03203977857469832,\n \"acc_norm\": 0.2713448583279182,\n\
\ \"acc_norm_stderr\": 0.03203801683897384,\n \"mc1\": 0.20807833537331702,\n\
\ \"mc1_stderr\": 0.01421050347357662,\n \"mc2\": 0.3520822015070296,\n\
\ \"mc2_stderr\": 0.013644875371279687\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3890784982935154,\n \"acc_stderr\": 0.014247309976045605,\n\
\ \"acc_norm\": 0.42150170648464164,\n \"acc_norm_stderr\": 0.014430197069326021\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.49352718581955785,\n\
\ \"acc_stderr\": 0.004989363276955168,\n \"acc_norm\": 0.6671977693686517,\n\
\ \"acc_norm_stderr\": 0.004702533775930289\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\
\ \"acc_stderr\": 0.03749850709174021,\n \"acc_norm\": 0.2518518518518518,\n\
\ \"acc_norm_stderr\": 0.03749850709174021\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.26973684210526316,\n \"acc_stderr\": 0.03611780560284898,\n\
\ \"acc_norm\": 0.26973684210526316,\n \"acc_norm_stderr\": 0.03611780560284898\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.27169811320754716,\n \"acc_stderr\": 0.027377706624670713,\n\
\ \"acc_norm\": 0.27169811320754716,\n \"acc_norm_stderr\": 0.027377706624670713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.23,\n \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\": 0.23,\n\
\ \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2774566473988439,\n\
\ \"acc_stderr\": 0.034140140070440354,\n \"acc_norm\": 0.2774566473988439,\n\
\ \"acc_norm_stderr\": 0.034140140070440354\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.042801058373643966,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.042801058373643966\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n\
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2127659574468085,\n \"acc_stderr\": 0.026754391348039776,\n\
\ \"acc_norm\": 0.2127659574468085,\n \"acc_norm_stderr\": 0.026754391348039776\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2620689655172414,\n \"acc_stderr\": 0.03664666337225256,\n\
\ \"acc_norm\": 0.2620689655172414,\n \"acc_norm_stderr\": 0.03664666337225256\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708617,\n \"\
acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708617\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.03809523809523811,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.03809523809523811\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.26129032258064516,\n\
\ \"acc_stderr\": 0.024993053397764826,\n \"acc_norm\": 0.26129032258064516,\n\
\ \"acc_norm_stderr\": 0.024993053397764826\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.26108374384236455,\n \"acc_stderr\": 0.030903796952114468,\n\
\ \"acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.030903796952114468\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\"\
: 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3686868686868687,\n \"acc_stderr\": 0.034373055019806184,\n \"\
acc_norm\": 0.3686868686868687,\n \"acc_norm_stderr\": 0.034373055019806184\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.29015544041450775,\n \"acc_stderr\": 0.03275264467791515,\n\
\ \"acc_norm\": 0.29015544041450775,\n \"acc_norm_stderr\": 0.03275264467791515\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.25384615384615383,\n \"acc_stderr\": 0.022066054378726257,\n\
\ \"acc_norm\": 0.25384615384615383,\n \"acc_norm_stderr\": 0.022066054378726257\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2184873949579832,\n \"acc_stderr\": 0.02684151432295894,\n \
\ \"acc_norm\": 0.2184873949579832,\n \"acc_norm_stderr\": 0.02684151432295894\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.037101857261199946,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.037101857261199946\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.26972477064220185,\n \"acc_stderr\": 0.01902848671111545,\n \"\
acc_norm\": 0.26972477064220185,\n \"acc_norm_stderr\": 0.01902848671111545\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2175925925925926,\n \"acc_stderr\": 0.028139689444859683,\n \"\
acc_norm\": 0.2175925925925926,\n \"acc_norm_stderr\": 0.028139689444859683\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.23039215686274508,\n \"acc_stderr\": 0.029554292605695077,\n \"\
acc_norm\": 0.23039215686274508,\n \"acc_norm_stderr\": 0.029554292605695077\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.19730941704035873,\n\
\ \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.19730941704035873,\n\
\ \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.03915345408847834,\n\
\ \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.03915345408847834\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.36363636363636365,\n \"acc_stderr\": 0.04391326286724071,\n \"\
acc_norm\": 0.36363636363636365,\n \"acc_norm_stderr\": 0.04391326286724071\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n\
\ \"acc_stderr\": 0.041331194402438376,\n \"acc_norm\": 0.24074074074074073,\n\
\ \"acc_norm_stderr\": 0.041331194402438376\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3128834355828221,\n \"acc_stderr\": 0.03642914578292404,\n\
\ \"acc_norm\": 0.3128834355828221,\n \"acc_norm_stderr\": 0.03642914578292404\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.22321428571428573,\n\
\ \"acc_stderr\": 0.039523019677025116,\n \"acc_norm\": 0.22321428571428573,\n\
\ \"acc_norm_stderr\": 0.039523019677025116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.30097087378640774,\n \"acc_stderr\": 0.04541609446503946,\n\
\ \"acc_norm\": 0.30097087378640774,\n \"acc_norm_stderr\": 0.04541609446503946\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.26495726495726496,\n\
\ \"acc_stderr\": 0.028911208802749465,\n \"acc_norm\": 0.26495726495726496,\n\
\ \"acc_norm_stderr\": 0.028911208802749465\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2260536398467433,\n\
\ \"acc_stderr\": 0.014957458504335837,\n \"acc_norm\": 0.2260536398467433,\n\
\ \"acc_norm_stderr\": 0.014957458504335837\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.27167630057803466,\n \"acc_stderr\": 0.023948512905468365,\n\
\ \"acc_norm\": 0.27167630057803466,\n \"acc_norm_stderr\": 0.023948512905468365\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.025646863097137904,\n\
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.025646863097137904\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2958199356913183,\n\
\ \"acc_stderr\": 0.025922371788818777,\n \"acc_norm\": 0.2958199356913183,\n\
\ \"acc_norm_stderr\": 0.025922371788818777\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25308641975308643,\n \"acc_stderr\": 0.024191808600713,\n\
\ \"acc_norm\": 0.25308641975308643,\n \"acc_norm_stderr\": 0.024191808600713\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2624113475177305,\n \"acc_stderr\": 0.026244920349843014,\n \
\ \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.026244920349843014\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27444589308996087,\n\
\ \"acc_stderr\": 0.011397043163078154,\n \"acc_norm\": 0.27444589308996087,\n\
\ \"acc_norm_stderr\": 0.011397043163078154\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.1875,\n \"acc_stderr\": 0.023709788253811766,\n \
\ \"acc_norm\": 0.1875,\n \"acc_norm_stderr\": 0.023709788253811766\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25980392156862747,\n \"acc_stderr\": 0.017740899509177788,\n \
\ \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.017740899509177788\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2909090909090909,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.2909090909090909,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2938775510204082,\n \"acc_stderr\": 0.029162738410249765,\n\
\ \"acc_norm\": 0.2938775510204082,\n \"acc_norm_stderr\": 0.029162738410249765\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.25870646766169153,\n\
\ \"acc_stderr\": 0.030965903123573012,\n \"acc_norm\": 0.25870646766169153,\n\
\ \"acc_norm_stderr\": 0.030965903123573012\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.30120481927710846,\n\
\ \"acc_stderr\": 0.035716092300534796,\n \"acc_norm\": 0.30120481927710846,\n\
\ \"acc_norm_stderr\": 0.035716092300534796\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.29239766081871343,\n \"acc_stderr\": 0.034886477134579215,\n\
\ \"acc_norm\": 0.29239766081871343,\n \"acc_norm_stderr\": 0.034886477134579215\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.20807833537331702,\n\
\ \"mc1_stderr\": 0.01421050347357662,\n \"mc2\": 0.3520822015070296,\n\
\ \"mc2_stderr\": 0.013644875371279687\n }\n}\n```"
repo_url: https://huggingface.co/Fredithefish/Guanaco-3B-Uncensored-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|arc:challenge|25_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hellaswag|10_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T20:46:40.695179.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T20:46:40.695179.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T20:46:40.695179.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T20:46:40.695179.parquet'
- config_name: results
data_files:
- split: 2023_08_30T20_46_40.695179
path:
- results_2023-08-30T20:46:40.695179.parquet
- split: latest
path:
- results_2023-08-30T20:46:40.695179.parquet
---
# Dataset Card for Evaluation run of Fredithefish/Guanaco-3B-Uncensored-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Fredithefish/Guanaco-3B-Uncensored-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Fredithefish/Guanaco-3B-Uncensored-v2](https://huggingface.co/Fredithefish/Guanaco-3B-Uncensored-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Fredithefish__Guanaco-3B-Uncensored-v2",
"harness_truthfulqa_mc_0",
split="train")
```
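As an aside, the timestamped split names in the configurations above (e.g. `2023_08_30T20_46_40.695179`) are derived mechanically from each run's ISO timestamp: dashes and colons become underscores. This helper is not part of the leaderboard tooling, just a sketch of the convention visible in this card, should you need to compute a split name from a run timestamp:

```python
def run_timestamp_to_split(ts: str) -> str:
    """Map a run timestamp such as '2023-08-30T20:46:40.695179' to the
    split name used in this card's configs ('2023_08_30T20_46_40.695179')."""
    # Replace the characters that are not valid in split names.
    return ts.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split("2023-08-30T20:46:40.695179"))
```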
## Latest results
These are the [latest results from run 2023-08-30T20:46:40.695179](https://huggingface.co/datasets/open-llm-leaderboard/details_Fredithefish__Guanaco-3B-Uncensored-v2/blob/main/results_2023-08-30T20%3A46%3A40.695179.json) (note that there might be results for other tasks in the repositories if successive evals didn't cover the same tasks; you can find each in the "results" configuration and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.26785174321367716,
"acc_stderr": 0.03203977857469832,
"acc_norm": 0.2713448583279182,
"acc_norm_stderr": 0.03203801683897384,
"mc1": 0.20807833537331702,
"mc1_stderr": 0.01421050347357662,
"mc2": 0.3520822015070296,
"mc2_stderr": 0.013644875371279687
},
"harness|arc:challenge|25": {
"acc": 0.3890784982935154,
"acc_stderr": 0.014247309976045605,
"acc_norm": 0.42150170648464164,
"acc_norm_stderr": 0.014430197069326021
},
"harness|hellaswag|10": {
"acc": 0.49352718581955785,
"acc_stderr": 0.004989363276955168,
"acc_norm": 0.6671977693686517,
"acc_norm_stderr": 0.004702533775930289
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.03749850709174021,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.03749850709174021
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.26973684210526316,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.26973684210526316,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27169811320754716,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.27169811320754716,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2774566473988439,
"acc_stderr": 0.034140140070440354,
"acc_norm": 0.2774566473988439,
"acc_norm_stderr": 0.034140140070440354
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.042801058373643966,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.042801058373643966
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2127659574468085,
"acc_stderr": 0.026754391348039776,
"acc_norm": 0.2127659574468085,
"acc_norm_stderr": 0.026754391348039776
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2620689655172414,
"acc_stderr": 0.03664666337225256,
"acc_norm": 0.2620689655172414,
"acc_norm_stderr": 0.03664666337225256
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708617,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708617
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.03809523809523811,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.03809523809523811
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.26129032258064516,
"acc_stderr": 0.024993053397764826,
"acc_norm": 0.26129032258064516,
"acc_norm_stderr": 0.024993053397764826
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.26108374384236455,
"acc_stderr": 0.030903796952114468,
"acc_norm": 0.26108374384236455,
"acc_norm_stderr": 0.030903796952114468
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3686868686868687,
"acc_stderr": 0.034373055019806184,
"acc_norm": 0.3686868686868687,
"acc_norm_stderr": 0.034373055019806184
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.29015544041450775,
"acc_stderr": 0.03275264467791515,
"acc_norm": 0.29015544041450775,
"acc_norm_stderr": 0.03275264467791515
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.25384615384615383,
"acc_stderr": 0.022066054378726257,
"acc_norm": 0.25384615384615383,
"acc_norm_stderr": 0.022066054378726257
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2184873949579832,
"acc_stderr": 0.02684151432295894,
"acc_norm": 0.2184873949579832,
"acc_norm_stderr": 0.02684151432295894
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.037101857261199946,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.037101857261199946
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.26972477064220185,
"acc_stderr": 0.01902848671111545,
"acc_norm": 0.26972477064220185,
"acc_norm_stderr": 0.01902848671111545
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2175925925925926,
"acc_stderr": 0.028139689444859683,
"acc_norm": 0.2175925925925926,
"acc_norm_stderr": 0.028139689444859683
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23039215686274508,
"acc_stderr": 0.029554292605695077,
"acc_norm": 0.23039215686274508,
"acc_norm_stderr": 0.029554292605695077
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.19730941704035873,
"acc_stderr": 0.02670985334496796,
"acc_norm": 0.19730941704035873,
"acc_norm_stderr": 0.02670985334496796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.03915345408847834,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.03915345408847834
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.36363636363636365,
"acc_stderr": 0.04391326286724071,
"acc_norm": 0.36363636363636365,
"acc_norm_stderr": 0.04391326286724071
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.041331194402438376,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.041331194402438376
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3128834355828221,
"acc_stderr": 0.03642914578292404,
"acc_norm": 0.3128834355828221,
"acc_norm_stderr": 0.03642914578292404
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.22321428571428573,
"acc_stderr": 0.039523019677025116,
"acc_norm": 0.22321428571428573,
"acc_norm_stderr": 0.039523019677025116
},
"harness|hendrycksTest-management|5": {
"acc": 0.30097087378640774,
"acc_stderr": 0.04541609446503946,
"acc_norm": 0.30097087378640774,
"acc_norm_stderr": 0.04541609446503946
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.26495726495726496,
"acc_stderr": 0.028911208802749465,
"acc_norm": 0.26495726495726496,
"acc_norm_stderr": 0.028911208802749465
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2260536398467433,
"acc_stderr": 0.014957458504335837,
"acc_norm": 0.2260536398467433,
"acc_norm_stderr": 0.014957458504335837
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.27167630057803466,
"acc_stderr": 0.023948512905468365,
"acc_norm": 0.27167630057803466,
"acc_norm_stderr": 0.023948512905468365
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.025646863097137904,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.025646863097137904
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2958199356913183,
"acc_stderr": 0.025922371788818777,
"acc_norm": 0.2958199356913183,
"acc_norm_stderr": 0.025922371788818777
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25308641975308643,
"acc_stderr": 0.024191808600713,
"acc_norm": 0.25308641975308643,
"acc_norm_stderr": 0.024191808600713
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.026244920349843014,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.026244920349843014
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.27444589308996087,
"acc_stderr": 0.011397043163078154,
"acc_norm": 0.27444589308996087,
"acc_norm_stderr": 0.011397043163078154
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.1875,
"acc_stderr": 0.023709788253811766,
"acc_norm": 0.1875,
"acc_norm_stderr": 0.023709788253811766
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.017740899509177788,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.017740899509177788
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2938775510204082,
"acc_stderr": 0.029162738410249765,
"acc_norm": 0.2938775510204082,
"acc_norm_stderr": 0.029162738410249765
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.25870646766169153,
"acc_stderr": 0.030965903123573012,
"acc_norm": 0.25870646766169153,
"acc_norm_stderr": 0.030965903123573012
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-virology|5": {
"acc": 0.30120481927710846,
"acc_stderr": 0.035716092300534796,
"acc_norm": 0.30120481927710846,
"acc_norm_stderr": 0.035716092300534796
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.034886477134579215,
"acc_norm": 0.29239766081871343,
"acc_norm_stderr": 0.034886477134579215
},
"harness|truthfulqa:mc|0": {
"mc1": 0.20807833537331702,
"mc1_stderr": 0.01421050347357662,
"mc2": 0.3520822015070296,
"mc2_stderr": 0.013644875371279687
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
JihyukKim/eli5-subquestion-paired-sft | 2023-08-30T20:48:45.000Z | [
"region:us"
] | JihyukKim | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: qid
dtype: string
- name: question
dtype: string
- name: response_j
dtype: string
- name: response_k
dtype: string
- name: gold_claims
sequence: string
- name: response_j_claims
sequence: string
- name: response_k_claims
sequence: string
splits:
- name: train
num_bytes: 20711053
num_examples: 17578
- name: test
num_bytes: 399389
num_examples: 340
download_size: 6461297
dataset_size: 21110442
---
# Dataset Card for "eli5-subquestion-paired-sft"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_automl_bank-marketing_gosdt_l256_d3_sd0 | 2023-08-30T20:57:26.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 2773600000
num_examples: 100000
- name: validation
num_bytes: 277360000
num_examples: 10000
download_size: 412140145
dataset_size: 3050960000
---
# Dataset Card for "autotree_automl_bank-marketing_gosdt_l256_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Samee-ur/tinystories_instruction_finetuning_with_pretrainingdata | 2023-08-30T21:26:59.000Z | [
"region:us"
] | Samee-ur | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 27650
num_examples: 206
download_size: 14275
dataset_size: 27650
---
# Dataset Card for "tinystories_instruction_finetuning_with_pretrainingdata"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
musli/vg | 2023-08-30T21:35:00.000Z | [
"region:us"
] | musli | null | null | null | 0 | 0 | Entry not found |
CommunistCowGod/tengoku | 2023-08-30T21:49:45.000Z | [
"region:us"
] | CommunistCowGod | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_jondurbin__airoboros-l2-7b-2.1 | 2023-08-30T21:49:53.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of jondurbin/airoboros-l2-7b-2.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/airoboros-l2-7b-2.1](https://huggingface.co/jondurbin/airoboros-l2-7b-2.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-l2-7b-2.1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-30T21:48:31.608881](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-7b-2.1/blob/main/results_2023-08-30T21%3A48%3A31.608881.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.44818395354859086,\n\
\ \"acc_stderr\": 0.035127835633127005,\n \"acc_norm\": 0.4519953391460999,\n\
\ \"acc_norm_stderr\": 0.03511317110865538,\n \"mc1\": 0.2937576499388005,\n\
\ \"mc1_stderr\": 0.015945068581236614,\n \"mc2\": 0.43954718122913994,\n\
\ \"mc2_stderr\": 0.014710608089324633\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5110921501706485,\n \"acc_stderr\": 0.01460779491401305,\n\
\ \"acc_norm\": 0.5443686006825939,\n \"acc_norm_stderr\": 0.014553749939306864\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5952001593308106,\n\
\ \"acc_stderr\": 0.004898501014225839,\n \"acc_norm\": 0.7867954590718981,\n\
\ \"acc_norm_stderr\": 0.004087339045106304\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.42962962962962964,\n\
\ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.42962962962962964,\n\
\ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.375,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4528301886792453,\n \"acc_stderr\": 0.03063562795796182,\n\
\ \"acc_norm\": 0.4528301886792453,\n \"acc_norm_stderr\": 0.03063562795796182\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4236111111111111,\n\
\ \"acc_stderr\": 0.0413212501972337,\n \"acc_norm\": 0.4236111111111111,\n\
\ \"acc_norm_stderr\": 0.0413212501972337\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\"\
: 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3815028901734104,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.3815028901734104,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.0379328118530781,\n\
\ \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.0379328118530781\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.032232762667117124,\n\
\ \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.032232762667117124\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.38620689655172413,\n \"acc_stderr\": 0.04057324734419035,\n\
\ \"acc_norm\": 0.38620689655172413,\n \"acc_norm_stderr\": 0.04057324734419035\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30158730158730157,\n \"acc_stderr\": 0.0236369759961018,\n \"\
acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.0236369759961018\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.04073524322147126,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.04073524322147126\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4838709677419355,\n\
\ \"acc_stderr\": 0.028429203176724555,\n \"acc_norm\": 0.4838709677419355,\n\
\ \"acc_norm_stderr\": 0.028429203176724555\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.35467980295566504,\n \"acc_stderr\": 0.033661244890514495,\n\
\ \"acc_norm\": 0.35467980295566504,\n \"acc_norm_stderr\": 0.033661244890514495\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\"\
: 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6303030303030303,\n \"acc_stderr\": 0.03769430314512567,\n\
\ \"acc_norm\": 0.6303030303030303,\n \"acc_norm_stderr\": 0.03769430314512567\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5656565656565656,\n \"acc_stderr\": 0.035315058793591834,\n \"\
acc_norm\": 0.5656565656565656,\n \"acc_norm_stderr\": 0.035315058793591834\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6632124352331606,\n \"acc_stderr\": 0.03410780251836184,\n\
\ \"acc_norm\": 0.6632124352331606,\n \"acc_norm_stderr\": 0.03410780251836184\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3923076923076923,\n \"acc_stderr\": 0.02475600038213095,\n \
\ \"acc_norm\": 0.3923076923076923,\n \"acc_norm_stderr\": 0.02475600038213095\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145658,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145658\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.40336134453781514,\n \"acc_stderr\": 0.031866081214088314,\n\
\ \"acc_norm\": 0.40336134453781514,\n \"acc_norm_stderr\": 0.031866081214088314\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.021004201260420075,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.021004201260420075\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.0291575221846056,\n\
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.0291575221846056\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5833333333333334,\n \"acc_stderr\": 0.03460228327239171,\n \"\
acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.03460228327239171\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6371308016877637,\n \"acc_stderr\": 0.03129920825530213,\n \
\ \"acc_norm\": 0.6371308016877637,\n \"acc_norm_stderr\": 0.03129920825530213\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5112107623318386,\n\
\ \"acc_stderr\": 0.033549366530984746,\n \"acc_norm\": 0.5112107623318386,\n\
\ \"acc_norm_stderr\": 0.033549366530984746\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.46564885496183206,\n \"acc_stderr\": 0.04374928560599738,\n\
\ \"acc_norm\": 0.46564885496183206,\n \"acc_norm_stderr\": 0.04374928560599738\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5702479338842975,\n \"acc_stderr\": 0.04519082021319772,\n \"\
acc_norm\": 0.5702479338842975,\n \"acc_norm_stderr\": 0.04519082021319772\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5030674846625767,\n \"acc_stderr\": 0.03928297078179663,\n\
\ \"acc_norm\": 0.5030674846625767,\n \"acc_norm_stderr\": 0.03928297078179663\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.49514563106796117,\n \"acc_stderr\": 0.049505043821289195,\n\
\ \"acc_norm\": 0.49514563106796117,\n \"acc_norm_stderr\": 0.049505043821289195\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6581196581196581,\n\
\ \"acc_stderr\": 0.031075028526507748,\n \"acc_norm\": 0.6581196581196581,\n\
\ \"acc_norm_stderr\": 0.031075028526507748\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6079182630906769,\n\
\ \"acc_stderr\": 0.01745852405014764,\n \"acc_norm\": 0.6079182630906769,\n\
\ \"acc_norm_stderr\": 0.01745852405014764\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5115606936416185,\n \"acc_stderr\": 0.02691189868637792,\n\
\ \"acc_norm\": 0.5115606936416185,\n \"acc_norm_stderr\": 0.02691189868637792\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4673202614379085,\n \"acc_stderr\": 0.028568699752225868,\n\
\ \"acc_norm\": 0.4673202614379085,\n \"acc_norm_stderr\": 0.028568699752225868\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5530546623794212,\n\
\ \"acc_stderr\": 0.02823776942208533,\n \"acc_norm\": 0.5530546623794212,\n\
\ \"acc_norm_stderr\": 0.02823776942208533\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4660493827160494,\n \"acc_stderr\": 0.02775653525734767,\n\
\ \"acc_norm\": 0.4660493827160494,\n \"acc_norm_stderr\": 0.02775653525734767\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.39361702127659576,\n \"acc_stderr\": 0.02914454478159615,\n \
\ \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.02914454478159615\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3474576271186441,\n\
\ \"acc_stderr\": 0.0121614177297498,\n \"acc_norm\": 0.3474576271186441,\n\
\ \"acc_norm_stderr\": 0.0121614177297498\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4852941176470588,\n \"acc_stderr\": 0.03035969707904611,\n\
\ \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.03035969707904611\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4297385620915033,\n \"acc_stderr\": 0.020027122784928558,\n \
\ \"acc_norm\": 0.4297385620915033,\n \"acc_norm_stderr\": 0.020027122784928558\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n\
\ \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n\
\ \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.49387755102040815,\n \"acc_stderr\": 0.03200682020163908,\n\
\ \"acc_norm\": 0.49387755102040815,\n \"acc_norm_stderr\": 0.03200682020163908\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5621890547263682,\n\
\ \"acc_stderr\": 0.035080801121998406,\n \"acc_norm\": 0.5621890547263682,\n\
\ \"acc_norm_stderr\": 0.035080801121998406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079022,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079022\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6491228070175439,\n \"acc_stderr\": 0.03660298834049163,\n\
\ \"acc_norm\": 0.6491228070175439,\n \"acc_norm_stderr\": 0.03660298834049163\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2937576499388005,\n\
\ \"mc1_stderr\": 0.015945068581236614,\n \"mc2\": 0.43954718122913994,\n\
\ \"mc2_stderr\": 0.014710608089324633\n }\n}\n```"
repo_url: https://huggingface.co/jondurbin/airoboros-l2-7b-2.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|arc:challenge|25_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hellaswag|10_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T21:48:31.608881.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T21:48:31.608881.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T21:48:31.608881.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T21:48:31.608881.parquet'
- config_name: results
data_files:
- split: 2023_08_30T21_48_31.608881
path:
- results_2023-08-30T21:48:31.608881.parquet
- split: latest
path:
- results_2023-08-30T21:48:31.608881.parquet
---
# Dataset Card for Evaluation run of jondurbin/airoboros-l2-7b-2.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-l2-7b-2.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-l2-7b-2.1](https://huggingface.co/jondurbin/airoboros-l2-7b-2.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-l2-7b-2.1",
"harness_truthfulqa_mc_0",
split="train")
```
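Split names encode the run timestamp with the `:` characters of the time portion replaced by `_` (e.g. `2023_08_30T21_48_31.608881` for the run at `2023-08-30T21:48:31.608881`). As a minimal sketch, the original timestamp can be recovered from a split name with the standard library:

```python
from datetime import datetime

# Split names use '_' in place of ':' in the time portion,
# e.g. "2023_08_30T21_48_31.608881".
split_name = "2023_08_30T21_48_31.608881"
run_ts = datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")
print(run_ts.isoformat())  # 2023-08-30T21:48:31.608881
```

This can be useful for sorting or filtering splits chronologically when a configuration accumulates several runs.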
## Latest results
These are the [latest results from run 2023-08-30T21:48:31.608881](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-7b-2.1/blob/main/results_2023-08-30T21%3A48%3A31.608881.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.44818395354859086,
"acc_stderr": 0.035127835633127005,
"acc_norm": 0.4519953391460999,
"acc_norm_stderr": 0.03511317110865538,
"mc1": 0.2937576499388005,
"mc1_stderr": 0.015945068581236614,
"mc2": 0.43954718122913994,
"mc2_stderr": 0.014710608089324633
},
"harness|arc:challenge|25": {
"acc": 0.5110921501706485,
"acc_stderr": 0.01460779491401305,
"acc_norm": 0.5443686006825939,
"acc_norm_stderr": 0.014553749939306864
},
"harness|hellaswag|10": {
"acc": 0.5952001593308106,
"acc_stderr": 0.004898501014225839,
"acc_norm": 0.7867954590718981,
"acc_norm_stderr": 0.004087339045106304
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.375,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.375,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4528301886792453,
"acc_stderr": 0.03063562795796182,
"acc_norm": 0.4528301886792453,
"acc_norm_stderr": 0.03063562795796182
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4236111111111111,
"acc_stderr": 0.0413212501972337,
"acc_norm": 0.4236111111111111,
"acc_norm_stderr": 0.0413212501972337
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3815028901734104,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.3815028901734104,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.0379328118530781,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.0379328118530781
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.032232762667117124,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.032232762667117124
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.38620689655172413,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.38620689655172413,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.0236369759961018,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.0236369759961018
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.04073524322147126,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.04073524322147126
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4838709677419355,
"acc_stderr": 0.028429203176724555,
"acc_norm": 0.4838709677419355,
"acc_norm_stderr": 0.028429203176724555
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35467980295566504,
"acc_stderr": 0.033661244890514495,
"acc_norm": 0.35467980295566504,
"acc_norm_stderr": 0.033661244890514495
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6303030303030303,
"acc_stderr": 0.03769430314512567,
"acc_norm": 0.6303030303030303,
"acc_norm_stderr": 0.03769430314512567
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5656565656565656,
"acc_stderr": 0.035315058793591834,
"acc_norm": 0.5656565656565656,
"acc_norm_stderr": 0.035315058793591834
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6632124352331606,
"acc_stderr": 0.03410780251836184,
"acc_norm": 0.6632124352331606,
"acc_norm_stderr": 0.03410780251836184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3923076923076923,
"acc_stderr": 0.02475600038213095,
"acc_norm": 0.3923076923076923,
"acc_norm_stderr": 0.02475600038213095
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145658,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145658
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.40336134453781514,
"acc_stderr": 0.031866081214088314,
"acc_norm": 0.40336134453781514,
"acc_norm_stderr": 0.031866081214088314
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943342,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6,
"acc_stderr": 0.021004201260420075,
"acc_norm": 0.6,
"acc_norm_stderr": 0.021004201260420075
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.0291575221846056,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.0291575221846056
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.03460228327239171,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.03460228327239171
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6371308016877637,
"acc_stderr": 0.03129920825530213,
"acc_norm": 0.6371308016877637,
"acc_norm_stderr": 0.03129920825530213
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5112107623318386,
"acc_stderr": 0.033549366530984746,
"acc_norm": 0.5112107623318386,
"acc_norm_stderr": 0.033549366530984746
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.46564885496183206,
"acc_stderr": 0.04374928560599738,
"acc_norm": 0.46564885496183206,
"acc_norm_stderr": 0.04374928560599738
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5702479338842975,
"acc_stderr": 0.04519082021319772,
"acc_norm": 0.5702479338842975,
"acc_norm_stderr": 0.04519082021319772
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5030674846625767,
"acc_stderr": 0.03928297078179663,
"acc_norm": 0.5030674846625767,
"acc_norm_stderr": 0.03928297078179663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.49514563106796117,
"acc_stderr": 0.049505043821289195,
"acc_norm": 0.49514563106796117,
"acc_norm_stderr": 0.049505043821289195
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6581196581196581,
"acc_stderr": 0.031075028526507748,
"acc_norm": 0.6581196581196581,
"acc_norm_stderr": 0.031075028526507748
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6079182630906769,
"acc_stderr": 0.01745852405014764,
"acc_norm": 0.6079182630906769,
"acc_norm_stderr": 0.01745852405014764
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5115606936416185,
"acc_stderr": 0.02691189868637792,
"acc_norm": 0.5115606936416185,
"acc_norm_stderr": 0.02691189868637792
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4673202614379085,
"acc_stderr": 0.028568699752225868,
"acc_norm": 0.4673202614379085,
"acc_norm_stderr": 0.028568699752225868
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5530546623794212,
"acc_stderr": 0.02823776942208533,
"acc_norm": 0.5530546623794212,
"acc_norm_stderr": 0.02823776942208533
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4660493827160494,
"acc_stderr": 0.02775653525734767,
"acc_norm": 0.4660493827160494,
"acc_norm_stderr": 0.02775653525734767
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.39361702127659576,
"acc_stderr": 0.02914454478159615,
"acc_norm": 0.39361702127659576,
"acc_norm_stderr": 0.02914454478159615
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3474576271186441,
"acc_stderr": 0.0121614177297498,
"acc_norm": 0.3474576271186441,
"acc_norm_stderr": 0.0121614177297498
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.03035969707904611,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.03035969707904611
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4297385620915033,
"acc_stderr": 0.020027122784928558,
"acc_norm": 0.4297385620915033,
"acc_norm_stderr": 0.020027122784928558
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.49387755102040815,
"acc_stderr": 0.03200682020163908,
"acc_norm": 0.49387755102040815,
"acc_norm_stderr": 0.03200682020163908
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5621890547263682,
"acc_stderr": 0.035080801121998406,
"acc_norm": 0.5621890547263682,
"acc_norm_stderr": 0.035080801121998406
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079022,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079022
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6491228070175439,
"acc_stderr": 0.03660298834049163,
"acc_norm": 0.6491228070175439,
"acc_norm_stderr": 0.03660298834049163
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2937576499388005,
"mc1_stderr": 0.015945068581236614,
"mc2": 0.43954718122913994,
"mc2_stderr": 0.014710608089324633
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_conceptofmind__Open-LLongMA-3b | 2023-08-30T22:00:44.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of conceptofmind/Open-LLongMA-3b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [conceptofmind/Open-LLongMA-3b](https://huggingface.co/conceptofmind/Open-LLongMA-3b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_conceptofmind__Open-LLongMA-3b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-30T21:59:22.661580](https://huggingface.co/datasets/open-llm-leaderboard/details_conceptofmind__Open-LLongMA-3b/blob/main/results_2023-08-30T21%3A59%3A22.661580.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2557876003278525,\n\
\ \"acc_stderr\": 0.03147445441526624,\n \"acc_norm\": 0.25883980680254576,\n\
\ \"acc_norm_stderr\": 0.031472248225636705,\n \"mc1\": 0.2252141982864137,\n\
\ \"mc1_stderr\": 0.014623240768023493,\n \"mc2\": 0.345076271513504,\n\
\ \"mc2_stderr\": 0.013239849784853331\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.38054607508532423,\n \"acc_stderr\": 0.014188277712349824,\n\
\ \"acc_norm\": 0.39761092150170646,\n \"acc_norm_stderr\": 0.014301752223279538\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.49153555068711413,\n\
\ \"acc_stderr\": 0.004989066355449554,\n \"acc_norm\": 0.6545508862776339,\n\
\ \"acc_norm_stderr\": 0.004745426656377574\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.035914440841969694,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.035914440841969694\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n\
\ \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n \
\ \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.27547169811320754,\n \"acc_stderr\": 0.027495663683724067,\n\
\ \"acc_norm\": 0.27547169811320754,\n \"acc_norm_stderr\": 0.027495663683724067\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2013888888888889,\n\
\ \"acc_stderr\": 0.033536474697138406,\n \"acc_norm\": 0.2013888888888889,\n\
\ \"acc_norm_stderr\": 0.033536474697138406\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.19,\n \"acc_stderr\": 0.03942772444036622,\n \"acc_norm\"\
: 0.19,\n \"acc_norm_stderr\": 0.03942772444036622\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n\
\ \"acc_stderr\": 0.030631145539198823,\n \"acc_norm\": 0.2023121387283237,\n\
\ \"acc_norm_stderr\": 0.030631145539198823\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617749,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617749\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3276595744680851,\n \"acc_stderr\": 0.030683020843231008,\n\
\ \"acc_norm\": 0.3276595744680851,\n \"acc_norm_stderr\": 0.030683020843231008\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.0414243971948936,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.0414243971948936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.03455930201924812,\n\
\ \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.03455930201924812\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.26455026455026454,\n \"acc_stderr\": 0.02271746789770861,\n \"\
acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.02271746789770861\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n\
\ \"acc_stderr\": 0.03512207412302053,\n \"acc_norm\": 0.19047619047619047,\n\
\ \"acc_norm_stderr\": 0.03512207412302053\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25806451612903225,\n\
\ \"acc_stderr\": 0.024892469172462846,\n \"acc_norm\": 0.25806451612903225,\n\
\ \"acc_norm_stderr\": 0.024892469172462846\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.270935960591133,\n \"acc_stderr\": 0.031270907132976984,\n\
\ \"acc_norm\": 0.270935960591133,\n \"acc_norm_stderr\": 0.031270907132976984\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\"\
: 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.22727272727272727,\n \"acc_stderr\": 0.02985751567338641,\n \"\
acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.02985751567338641\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.029778663037752954,\n\
\ \"acc_norm\": 0.21761658031088082,\n \"acc_norm_stderr\": 0.029778663037752954\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.22564102564102564,\n \"acc_stderr\": 0.021193632525148533,\n\
\ \"acc_norm\": 0.22564102564102564,\n \"acc_norm_stderr\": 0.021193632525148533\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.026335739404055803,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.026335739404055803\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.02738140692786897,\n\
\ \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.02738140692786897\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.25165562913907286,\n \"acc_stderr\": 0.035433042343899844,\n \"\
acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.035433042343899844\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.2,\n \"acc_stderr\": 0.01714985851425095,\n \"acc_norm\": 0.2,\n\
\ \"acc_norm_stderr\": 0.01714985851425095\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.19907407407407407,\n \"acc_stderr\": 0.02723229846269023,\n\
\ \"acc_norm\": 0.19907407407407407,\n \"acc_norm_stderr\": 0.02723229846269023\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.23039215686274508,\n \"acc_stderr\": 0.029554292605695066,\n \"\
acc_norm\": 0.23039215686274508,\n \"acc_norm_stderr\": 0.029554292605695066\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.27848101265822783,\n \"acc_stderr\": 0.029178682304842544,\n \
\ \"acc_norm\": 0.27848101265822783,\n \"acc_norm_stderr\": 0.029178682304842544\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3721973094170404,\n\
\ \"acc_stderr\": 0.03244305283008731,\n \"acc_norm\": 0.3721973094170404,\n\
\ \"acc_norm_stderr\": 0.03244305283008731\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2231404958677686,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.2231404958677686,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.20245398773006135,\n \"acc_stderr\": 0.03157065078911902,\n\
\ \"acc_norm\": 0.20245398773006135,\n \"acc_norm_stderr\": 0.03157065078911902\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.04246624336697625,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.04246624336697625\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690877,\n\
\ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690877\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.02934311479809447,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.02934311479809447\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.28991060025542786,\n\
\ \"acc_stderr\": 0.016225017944770957,\n \"acc_norm\": 0.28991060025542786,\n\
\ \"acc_norm_stderr\": 0.016225017944770957\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.25722543352601157,\n \"acc_stderr\": 0.023532925431044283,\n\
\ \"acc_norm\": 0.25722543352601157,\n \"acc_norm_stderr\": 0.023532925431044283\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.02463004897982476,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.02463004897982476\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2797427652733119,\n\
\ \"acc_stderr\": 0.025494259350694905,\n \"acc_norm\": 0.2797427652733119,\n\
\ \"acc_norm_stderr\": 0.025494259350694905\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.024659685185967277,\n\
\ \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.024659685185967277\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2730496453900709,\n \"acc_stderr\": 0.026577860943307854,\n \
\ \"acc_norm\": 0.2730496453900709,\n \"acc_norm_stderr\": 0.026577860943307854\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2392438070404172,\n\
\ \"acc_stderr\": 0.010896123652676653,\n \"acc_norm\": 0.2392438070404172,\n\
\ \"acc_norm_stderr\": 0.010896123652676653\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.026303648393696036,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.026303648393696036\n \
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\"\
: 0.24509803921568626,\n \"acc_stderr\": 0.01740181671142766,\n \"\
acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.01740181671142766\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.33636363636363636,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.33636363636363636,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.19591836734693877,\n \"acc_stderr\": 0.025409301953225678,\n\
\ \"acc_norm\": 0.19591836734693877,\n \"acc_norm_stderr\": 0.025409301953225678\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.030360490154014645,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.030360490154014645\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n\
\ \"acc_stderr\": 0.03629335329947861,\n \"acc_norm\": 0.3192771084337349,\n\
\ \"acc_norm_stderr\": 0.03629335329947861\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2252141982864137,\n\
\ \"mc1_stderr\": 0.014623240768023493,\n \"mc2\": 0.345076271513504,\n\
\ \"mc2_stderr\": 0.013239849784853331\n }\n}\n```"
repo_url: https://huggingface.co/conceptofmind/Open-LLongMA-3b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|arc:challenge|25_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hellaswag|10_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T21:59:22.661580.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T21:59:22.661580.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T21:59:22.661580.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T21:59:22.661580.parquet'
- config_name: results
data_files:
- split: 2023_08_30T21_59_22.661580
path:
- results_2023-08-30T21:59:22.661580.parquet
- split: latest
path:
- results_2023-08-30T21:59:22.661580.parquet
---
# Dataset Card for Evaluation run of conceptofmind/Open-LLongMA-3b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/conceptofmind/Open-LLongMA-3b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [conceptofmind/Open-LLongMA-3b](https://huggingface.co/conceptofmind/Open-LLongMA-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_conceptofmind__Open-LLongMA-3b",
"harness_truthfulqa_mc_0",
split="train")
```
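The split names in `data_files` are simply the run's ISO timestamp with `-` and `:` replaced by `_` (compare split `2023_08_30T21_59_22.661580` with timestamp `2023-08-30T21:59:22.661580`). A minimal helper illustrating that convention (an illustrative sketch of the naming scheme, not part of the card's official tooling):

```python
def timestamp_to_split(ts: str) -> str:
    """Map a run timestamp like '2023-08-30T21:59:22.661580'
    to its split name '2023_08_30T21_59_22.661580'."""
    return ts.replace("-", "_").replace(":", "_")


def split_to_timestamp(split: str) -> str:
    """Inverse mapping: restore the ISO-8601 run timestamp
    from a split name (the 'T' separates date from time)."""
    date, _, time = split.partition("T")
    return date.replace("_", "-") + "T" + time.replace("_", ":")
```

This makes it easy to pick a specific historical run as the `split=` argument of `load_dataset` instead of relying on the `latest` alias.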
## Latest results
These are the [latest results from run 2023-08-30T21:59:22.661580](https://huggingface.co/datasets/open-llm-leaderboard/details_conceptofmind__Open-LLongMA-3b/blob/main/results_2023-08-30T21%3A59%3A22.661580.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2557876003278525,
"acc_stderr": 0.03147445441526624,
"acc_norm": 0.25883980680254576,
"acc_norm_stderr": 0.031472248225636705,
"mc1": 0.2252141982864137,
"mc1_stderr": 0.014623240768023493,
"mc2": 0.345076271513504,
"mc2_stderr": 0.013239849784853331
},
"harness|arc:challenge|25": {
"acc": 0.38054607508532423,
"acc_stderr": 0.014188277712349824,
"acc_norm": 0.39761092150170646,
"acc_norm_stderr": 0.014301752223279538
},
"harness|hellaswag|10": {
"acc": 0.49153555068711413,
"acc_stderr": 0.004989066355449554,
"acc_norm": 0.6545508862776339,
"acc_norm_stderr": 0.004745426656377574
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.035914440841969694,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.035914440841969694
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19736842105263158,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.19736842105263158,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27547169811320754,
"acc_stderr": 0.027495663683724067,
"acc_norm": 0.27547169811320754,
"acc_norm_stderr": 0.027495663683724067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2013888888888889,
"acc_stderr": 0.033536474697138406,
"acc_norm": 0.2013888888888889,
"acc_norm_stderr": 0.033536474697138406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036622,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036622
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2023121387283237,
"acc_stderr": 0.030631145539198823,
"acc_norm": 0.2023121387283237,
"acc_norm_stderr": 0.030631145539198823
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617749,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617749
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3276595744680851,
"acc_stderr": 0.030683020843231008,
"acc_norm": 0.3276595744680851,
"acc_norm_stderr": 0.030683020843231008
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.0414243971948936,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.0414243971948936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2206896551724138,
"acc_stderr": 0.03455930201924812,
"acc_norm": 0.2206896551724138,
"acc_norm_stderr": 0.03455930201924812
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.02271746789770861,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.02271746789770861
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.03512207412302053,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.03512207412302053
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25806451612903225,
"acc_stderr": 0.024892469172462846,
"acc_norm": 0.25806451612903225,
"acc_norm_stderr": 0.024892469172462846
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.270935960591133,
"acc_stderr": 0.031270907132976984,
"acc_norm": 0.270935960591133,
"acc_norm_stderr": 0.031270907132976984
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.02985751567338641,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.02985751567338641
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21761658031088082,
"acc_stderr": 0.029778663037752954,
"acc_norm": 0.21761658031088082,
"acc_norm_stderr": 0.029778663037752954
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.22564102564102564,
"acc_stderr": 0.021193632525148533,
"acc_norm": 0.22564102564102564,
"acc_norm_stderr": 0.021193632525148533
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.026335739404055803,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.026335739404055803
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.02738140692786897,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.02738140692786897
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.035433042343899844,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.035433042343899844
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.2,
"acc_stderr": 0.01714985851425095,
"acc_norm": 0.2,
"acc_norm_stderr": 0.01714985851425095
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.19907407407407407,
"acc_stderr": 0.02723229846269023,
"acc_norm": 0.19907407407407407,
"acc_norm_stderr": 0.02723229846269023
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23039215686274508,
"acc_stderr": 0.029554292605695066,
"acc_norm": 0.23039215686274508,
"acc_norm_stderr": 0.029554292605695066
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.27848101265822783,
"acc_stderr": 0.029178682304842544,
"acc_norm": 0.27848101265822783,
"acc_norm_stderr": 0.029178682304842544
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3721973094170404,
"acc_stderr": 0.03244305283008731,
"acc_norm": 0.3721973094170404,
"acc_norm_stderr": 0.03244305283008731
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2231404958677686,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.2231404958677686,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.20245398773006135,
"acc_stderr": 0.03157065078911902,
"acc_norm": 0.20245398773006135,
"acc_norm_stderr": 0.03157065078911902
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.04246624336697625,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.04246624336697625
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690877,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690877
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02934311479809447,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02934311479809447
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.28991060025542786,
"acc_stderr": 0.016225017944770957,
"acc_norm": 0.28991060025542786,
"acc_norm_stderr": 0.016225017944770957
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.25722543352601157,
"acc_stderr": 0.023532925431044283,
"acc_norm": 0.25722543352601157,
"acc_norm_stderr": 0.023532925431044283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.02463004897982476,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.02463004897982476
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2797427652733119,
"acc_stderr": 0.025494259350694905,
"acc_norm": 0.2797427652733119,
"acc_norm_stderr": 0.025494259350694905
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.024659685185967277,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.024659685185967277
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2730496453900709,
"acc_stderr": 0.026577860943307854,
"acc_norm": 0.2730496453900709,
"acc_norm_stderr": 0.026577860943307854
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2392438070404172,
"acc_stderr": 0.010896123652676653,
"acc_norm": 0.2392438070404172,
"acc_norm_stderr": 0.010896123652676653
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.25,
"acc_stderr": 0.026303648393696036,
"acc_norm": 0.25,
"acc_norm_stderr": 0.026303648393696036
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.01740181671142766,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.01740181671142766
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.33636363636363636,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.33636363636363636,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19591836734693877,
"acc_stderr": 0.025409301953225678,
"acc_norm": 0.19591836734693877,
"acc_norm_stderr": 0.025409301953225678
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.030360490154014645,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.030360490154014645
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3192771084337349,
"acc_stderr": 0.03629335329947861,
"acc_norm": 0.3192771084337349,
"acc_norm_stderr": 0.03629335329947861
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2252141982864137,
"mc1_stderr": 0.014623240768023493,
"mc2": 0.345076271513504,
"mc2_stderr": 0.013239849784853331
}
}
```
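All per-task entries in the payload above share the same metric fields, so a macro average over the MMLU (`hendrycksTest`) subtasks can be sketched directly from the dictionary. The snippet below uses a small subset of the run's actual numbers purely for illustration; the leaderboard's own aggregation may differ:

```python
# A results payload shaped like the JSON above (subset of three subtasks).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.25},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.2222222222222222},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.19736842105263158},
}

# Select the MMLU subtasks by their "harness|hendrycksTest-" prefix
# and macro-average the `acc` metric across them.
mmlu = {k: v for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")}
macro_acc = sum(v["acc"] for v in mmlu.values()) / len(mmlu)
print(round(macro_acc, 4))  # → 0.2232
```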
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
AmelieSchreiber/cafa5_pickle_split | 2023-08-30T22:35:34.000Z | [
"license:mit",
"region:us"
] | AmelieSchreiber | null | null | null | 0 | 0 | ---
license: mit
---
|
TokenBender/hindi_convo_high_quality | 2023-09-01T09:47:34.000Z | [
"license:apache-2.0",
"region:us"
] | TokenBender | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_TheBloke__OpenOrcaxOpenChat-Preview2-13B-GPTQ | 2023-08-30T22:08:11.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TheBloke/OpenOrcaxOpenChat-Preview2-13B-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/OpenOrcaxOpenChat-Preview2-13B-GPTQ](https://huggingface.co/TheBloke/OpenOrcaxOpenChat-Preview2-13B-GPTQ)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__OpenOrcaxOpenChat-Preview2-13B-GPTQ\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-30T22:06:48.097340](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__OpenOrcaxOpenChat-Preview2-13B-GPTQ/blob/main/results_2023-08-30T22%3A06%3A48.097340.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.579449026070526,\n\
\ \"acc_stderr\": 0.034083790596487896,\n \"acc_norm\": 0.5831845343672756,\n\
\ \"acc_norm_stderr\": 0.03406436773103694,\n \"mc1\": 0.3623011015911873,\n\
\ \"mc1_stderr\": 0.016826646897262255,\n \"mc2\": 0.5021851197251754,\n\
\ \"mc2_stderr\": 0.015571255623835292\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.591296928327645,\n \"acc_stderr\": 0.014365750345427001,\n\
\ \"acc_norm\": 0.6126279863481229,\n \"acc_norm_stderr\": 0.01423587248790987\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6223859788886676,\n\
\ \"acc_stderr\": 0.004837995637638539,\n \"acc_norm\": 0.821449910376419,\n\
\ \"acc_norm_stderr\": 0.0038219244335487754\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.040179012759817494,\n\
\ \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.040179012759817494\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6113207547169811,\n \"acc_stderr\": 0.030000485448675986,\n\
\ \"acc_norm\": 0.6113207547169811,\n \"acc_norm_stderr\": 0.030000485448675986\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.625,\n\
\ \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.625,\n \
\ \"acc_norm_stderr\": 0.04048439222695598\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5722543352601156,\n\
\ \"acc_stderr\": 0.03772446857518026,\n \"acc_norm\": 0.5722543352601156,\n\
\ \"acc_norm_stderr\": 0.03772446857518026\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006717,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006717\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.04560480215720685,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.04560480215720685\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4595744680851064,\n \"acc_stderr\": 0.03257901482099835,\n\
\ \"acc_norm\": 0.4595744680851064,\n \"acc_norm_stderr\": 0.03257901482099835\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.04372748290278007,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.04372748290278007\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.024278568024307695,\n \"\
acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.024278568024307695\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.041905964388711366,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.041905964388711366\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6967741935483871,\n\
\ \"acc_stderr\": 0.026148685930671746,\n \"acc_norm\": 0.6967741935483871,\n\
\ \"acc_norm_stderr\": 0.026148685930671746\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.035025446508458714,\n\
\ \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.035025446508458714\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562427,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562427\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932026,\n \"\
acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932026\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.028408953626245265,\n\
\ \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.028408953626245265\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5641025641025641,\n \"acc_stderr\": 0.025141801511177498,\n\
\ \"acc_norm\": 0.5641025641025641,\n \"acc_norm_stderr\": 0.025141801511177498\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608463,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608463\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.03156663099215416,\n \
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.03156663099215416\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7559633027522936,\n \"acc_stderr\": 0.018415286351416402,\n \"\
acc_norm\": 0.7559633027522936,\n \"acc_norm_stderr\": 0.018415286351416402\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643525,\n \"\
acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643525\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695066,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695066\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455005,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455005\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.04010358942462203,\n\
\ \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.04010358942462203\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591207,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591207\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n\
\ \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690879,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690879\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.02390232554956041,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.02390232554956041\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7637292464878672,\n\
\ \"acc_stderr\": 0.015190473717037497,\n \"acc_norm\": 0.7637292464878672,\n\
\ \"acc_norm_stderr\": 0.015190473717037497\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.0257228022008958,\n\
\ \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.0257228022008958\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.47374301675977654,\n\
\ \"acc_stderr\": 0.01669942767278477,\n \"acc_norm\": 0.47374301675977654,\n\
\ \"acc_norm_stderr\": 0.01669942767278477\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.02736359328468496,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.02736359328468496\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6495176848874598,\n\
\ \"acc_stderr\": 0.027098652621301754,\n \"acc_norm\": 0.6495176848874598,\n\
\ \"acc_norm_stderr\": 0.027098652621301754\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.02604176620271716,\n\
\ \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.02604176620271716\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4348109517601043,\n\
\ \"acc_stderr\": 0.012661233805616292,\n \"acc_norm\": 0.4348109517601043,\n\
\ \"acc_norm_stderr\": 0.012661233805616292\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5257352941176471,\n \"acc_stderr\": 0.03033257809455502,\n\
\ \"acc_norm\": 0.5257352941176471,\n \"acc_norm_stderr\": 0.03033257809455502\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5931372549019608,\n \"acc_stderr\": 0.019873802005061173,\n \
\ \"acc_norm\": 0.5931372549019608,\n \"acc_norm_stderr\": 0.019873802005061173\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6938775510204082,\n \"acc_stderr\": 0.029504896454595957,\n\
\ \"acc_norm\": 0.6938775510204082,\n \"acc_norm_stderr\": 0.029504896454595957\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n\
\ \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.7512437810945274,\n\
\ \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.030944459778533214,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.030944459778533214\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3623011015911873,\n\
\ \"mc1_stderr\": 0.016826646897262255,\n \"mc2\": 0.5021851197251754,\n\
\ \"mc2_stderr\": 0.015571255623835292\n }\n}\n```"
repo_url: https://huggingface.co/DataLinguistic/DataLinguistic-34B-V1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|arc:challenge|25_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hellaswag|10_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T22:06:48.097340.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T22:06:48.097340.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T22:06:48.097340.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T22:06:48.097340.parquet'
- config_name: results
data_files:
- split: 2023_08_30T22_06_48.097340
path:
- results_2023-08-30T22:06:48.097340.parquet
- split: latest
path:
- results_2023-08-30T22:06:48.097340.parquet
---
# Dataset Card for Evaluation run of TheBloke/OpenOrcaxOpenChat-Preview2-13B-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/OpenOrcaxOpenChat-Preview2-13B-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/OpenOrcaxOpenChat-Preview2-13B-GPTQ](https://huggingface.co/TheBloke/OpenOrcaxOpenChat-Preview2-13B-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__OpenOrcaxOpenChat-Preview2-13B-GPTQ",
"harness_truthfulqa_mc_0",
split="train")
```
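As a side note on the naming scheme (a hypothetical helper, not part of the generated card): the timestamped split names are simply the run timestamp with dashes and colons replaced by underscores, as can be seen in the `configs` section above.

```python
# Hypothetical helper: derive the timestamped split name used in this
# dataset from a run timestamp ("-" and ":" become "_").
def split_name(run_timestamp: str) -> str:
    return run_timestamp.replace("-", "_").replace(":", "_")

split_name("2023-08-30T22:06:48.097340")
# -> '2023_08_30T22_06_48.097340'
```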
## Latest results
These are the [latest results from run 2023-08-30T22:06:48.097340](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__OpenOrcaxOpenChat-Preview2-13B-GPTQ/blob/main/results_2023-08-30T22%3A06%3A48.097340.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.579449026070526,
"acc_stderr": 0.034083790596487896,
"acc_norm": 0.5831845343672756,
"acc_norm_stderr": 0.03406436773103694,
"mc1": 0.3623011015911873,
"mc1_stderr": 0.016826646897262255,
"mc2": 0.5021851197251754,
"mc2_stderr": 0.015571255623835292
},
"harness|arc:challenge|25": {
"acc": 0.591296928327645,
"acc_stderr": 0.014365750345427001,
"acc_norm": 0.6126279863481229,
"acc_norm_stderr": 0.01423587248790987
},
"harness|hellaswag|10": {
"acc": 0.6223859788886676,
"acc_stderr": 0.004837995637638539,
"acc_norm": 0.821449910376419,
"acc_norm_stderr": 0.0038219244335487754
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.040179012759817494,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.040179012759817494
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6113207547169811,
"acc_stderr": 0.030000485448675986,
"acc_norm": 0.6113207547169811,
"acc_norm_stderr": 0.030000485448675986
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.625,
"acc_stderr": 0.04048439222695598,
"acc_norm": 0.625,
"acc_norm_stderr": 0.04048439222695598
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.03772446857518026,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.03772446857518026
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006717,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006717
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720685,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720685
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4595744680851064,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.4595744680851064,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.04372748290278007,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.04372748290278007
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.024278568024307695,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.024278568024307695
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.041905964388711366,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.041905964388711366
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6967741935483871,
"acc_stderr": 0.026148685930671746,
"acc_norm": 0.6967741935483871,
"acc_norm_stderr": 0.026148685930671746
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.035025446508458714,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.035025446508458714
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562427,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562427
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.030532892233932026,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.030532892233932026
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.028408953626245265,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.028408953626245265
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5641025641025641,
"acc_stderr": 0.025141801511177498,
"acc_norm": 0.5641025641025641,
"acc_norm_stderr": 0.025141801511177498
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608463,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608463
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.03156663099215416,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.03156663099215416
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7559633027522936,
"acc_stderr": 0.018415286351416402,
"acc_norm": 0.7559633027522936,
"acc_norm_stderr": 0.018415286351416402
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643525,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643525
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.029554292605695066,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.029554292605695066
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455005,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455005
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.04010358942462203,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.04010358942462203
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690879,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690879
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.02390232554956041,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.02390232554956041
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7637292464878672,
"acc_stderr": 0.015190473717037497,
"acc_norm": 0.7637292464878672,
"acc_norm_stderr": 0.015190473717037497
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.0257228022008958,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.0257228022008958
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.47374301675977654,
"acc_stderr": 0.01669942767278477,
"acc_norm": 0.47374301675977654,
"acc_norm_stderr": 0.01669942767278477
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.02736359328468496,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.02736359328468496
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6495176848874598,
"acc_stderr": 0.027098652621301754,
"acc_norm": 0.6495176848874598,
"acc_norm_stderr": 0.027098652621301754
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.02604176620271716,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.02604176620271716
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4348109517601043,
"acc_stderr": 0.012661233805616292,
"acc_norm": 0.4348109517601043,
"acc_norm_stderr": 0.012661233805616292
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5257352941176471,
"acc_stderr": 0.03033257809455502,
"acc_norm": 0.5257352941176471,
"acc_norm_stderr": 0.03033257809455502
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5931372549019608,
"acc_stderr": 0.019873802005061173,
"acc_norm": 0.5931372549019608,
"acc_norm_stderr": 0.019873802005061173
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6938775510204082,
"acc_stderr": 0.029504896454595957,
"acc_norm": 0.6938775510204082,
"acc_norm_stderr": 0.029504896454595957
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7512437810945274,
"acc_stderr": 0.030567675938916714,
"acc_norm": 0.7512437810945274,
"acc_norm_stderr": 0.030567675938916714
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.030944459778533214,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.030944459778533214
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3623011015911873,
"mc1_stderr": 0.016826646897262255,
"mc2": 0.5021851197251754,
"mc2_stderr": 0.015571255623835292
}
}
```
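The task keys in the JSON above (e.g. `harness|hendrycksTest-world_religions|5`) map onto the configuration names listed in the YAML header by replacing `|`, `-`, and `:` with underscores; a minimal sketch of this mapping (a hypothetical helper, inferred from the config names in this card):

```python
# Hypothetical helper: convert a harness task key from the results JSON
# into the corresponding dataset configuration name.
def task_to_config(task_key: str) -> str:
    for ch in "|-:":
        task_key = task_key.replace(ch, "_")
    return task_key

task_to_config("harness|hendrycksTest-world_religions|5")
# -> 'harness_hendrycksTest_world_religions_5'
task_to_config("harness|truthfulqa:mc|0")
# -> 'harness_truthfulqa_mc_0'
```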
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
loubnabnl/dummy_1 | 2023-08-31T10:14:51.000Z | [
"region:us"
] | loubnabnl | null | null | null | 0 | 0 | Entry not found |
dshut002/Mermaid_LLAMA_SimpleInstruct | 2023-08-30T22:45:49.000Z | [
"region:us"
] | dshut002 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 500
num_examples: 1
download_size: 4587
dataset_size: 500
---
# Dataset Card for "Mermaid_LLAMA_SimpleInstruct"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
raycast6000/elias-p4-recordings | 2023-08-30T23:28:40.000Z | [
"license:openrail",
"region:us"
] | raycast6000 | null | null | null | 0 | 0 | ---
license: openrail
---
|
open-llm-leaderboard/details_uukuguy__speechless-orca-platypus-coig-lite-4k-0.5e-13b | 2023-08-30T23:38:53.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of uukuguy/speechless-orca-platypus-coig-lite-4k-0.5e-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [uukuguy/speechless-orca-platypus-coig-lite-4k-0.5e-13b](https://huggingface.co/uukuguy/speechless-orca-platypus-coig-lite-4k-0.5e-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-orca-platypus-coig-lite-4k-0.5e-13b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-30T23:37:31.114358](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-orca-platypus-coig-lite-4k-0.5e-13b/blob/main/results_2023-08-30T23%3A37%3A31.114358.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5724854113753837,\n\
\ \"acc_stderr\": 0.03436338726163606,\n \"acc_norm\": 0.5766543108597004,\n\
\ \"acc_norm_stderr\": 0.03434541534553873,\n \"mc1\": 0.3390452876376989,\n\
\ \"mc1_stderr\": 0.016571797910626608,\n \"mc2\": 0.4803942259941586,\n\
\ \"mc2_stderr\": 0.015171048096844537\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5366894197952219,\n \"acc_stderr\": 0.01457200052775699,\n\
\ \"acc_norm\": 0.5802047781569966,\n \"acc_norm_stderr\": 0.014422181226303026\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5990838478390759,\n\
\ \"acc_stderr\": 0.004890824718530301,\n \"acc_norm\": 0.8015335590519816,\n\
\ \"acc_norm_stderr\": 0.00398030097024142\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.029946498567699948,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.029946498567699948\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \"acc_norm\": 0.42,\n\
\ \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4913294797687861,\n\
\ \"acc_stderr\": 0.038118909889404126,\n \"acc_norm\": 0.4913294797687861,\n\
\ \"acc_norm_stderr\": 0.038118909889404126\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4723404255319149,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.4723404255319149,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.34656084656084657,\n \"acc_stderr\": 0.024508777521028424,\n \"\
acc_norm\": 0.34656084656084657,\n \"acc_norm_stderr\": 0.024508777521028424\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6645161290322581,\n\
\ \"acc_stderr\": 0.026860206444724356,\n \"acc_norm\": 0.6645161290322581,\n\
\ \"acc_norm_stderr\": 0.026860206444724356\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.43349753694581283,\n \"acc_stderr\": 0.03486731727419872,\n\
\ \"acc_norm\": 0.43349753694581283,\n \"acc_norm_stderr\": 0.03486731727419872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.035014387062967806,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.035014387062967806\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7373737373737373,\n \"acc_stderr\": 0.031353050095330855,\n \"\
acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.031353050095330855\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n\
\ \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5871794871794872,\n \"acc_stderr\": 0.024962683564331806,\n\
\ \"acc_norm\": 0.5871794871794872,\n \"acc_norm_stderr\": 0.024962683564331806\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6050420168067226,\n \"acc_stderr\": 0.03175367846096626,\n \
\ \"acc_norm\": 0.6050420168067226,\n \"acc_norm_stderr\": 0.03175367846096626\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7339449541284404,\n\
\ \"acc_stderr\": 0.018946022322225604,\n \"acc_norm\": 0.7339449541284404,\n\
\ \"acc_norm_stderr\": 0.018946022322225604\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n\
\ \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849313,\n \"\
acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849313\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n\
\ \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6942148760330579,\n \"acc_stderr\": 0.04205953933884123,\n \"\
acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.04205953933884123\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664742,\n\
\ \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664742\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291518,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291518\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326468,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326468\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7863247863247863,\n\
\ \"acc_stderr\": 0.026853450377009164,\n \"acc_norm\": 0.7863247863247863,\n\
\ \"acc_norm_stderr\": 0.026853450377009164\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7343550446998723,\n\
\ \"acc_stderr\": 0.015794302487888726,\n \"acc_norm\": 0.7343550446998723,\n\
\ \"acc_norm_stderr\": 0.015794302487888726\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6502890173410405,\n \"acc_stderr\": 0.025674281456531018,\n\
\ \"acc_norm\": 0.6502890173410405,\n \"acc_norm_stderr\": 0.025674281456531018\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.46368715083798884,\n\
\ \"acc_stderr\": 0.01667834189453317,\n \"acc_norm\": 0.46368715083798884,\n\
\ \"acc_norm_stderr\": 0.01667834189453317\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027914055510468008,\n\
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027914055510468008\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6463022508038585,\n\
\ \"acc_stderr\": 0.02715520810320087,\n \"acc_norm\": 0.6463022508038585,\n\
\ \"acc_norm_stderr\": 0.02715520810320087\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6419753086419753,\n \"acc_stderr\": 0.026675611926037096,\n\
\ \"acc_norm\": 0.6419753086419753,\n \"acc_norm_stderr\": 0.026675611926037096\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.02970045324729148,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.02970045324729148\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4517601043024772,\n\
\ \"acc_stderr\": 0.012710662233660247,\n \"acc_norm\": 0.4517601043024772,\n\
\ \"acc_norm_stderr\": 0.012710662233660247\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5110294117647058,\n \"acc_stderr\": 0.030365446477275675,\n\
\ \"acc_norm\": 0.5110294117647058,\n \"acc_norm_stderr\": 0.030365446477275675\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5833333333333334,\n \"acc_stderr\": 0.019944914136873583,\n \
\ \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.019944914136873583\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726492,\n\
\ \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726492\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7114427860696517,\n\
\ \"acc_stderr\": 0.03203841040213322,\n \"acc_norm\": 0.7114427860696517,\n\
\ \"acc_norm_stderr\": 0.03203841040213322\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3390452876376989,\n\
\ \"mc1_stderr\": 0.016571797910626608,\n \"mc2\": 0.4803942259941586,\n\
\ \"mc2_stderr\": 0.015171048096844537\n }\n}\n```"
repo_url: https://huggingface.co/uukuguy/speechless-orca-platypus-coig-lite-4k-0.5e-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|arc:challenge|25_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hellaswag|10_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T23:37:31.114358.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T23:37:31.114358.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T23:37:31.114358.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T23:37:31.114358.parquet'
- config_name: results
data_files:
- split: 2023_08_30T23_37_31.114358
path:
- results_2023-08-30T23:37:31.114358.parquet
- split: latest
path:
- results_2023-08-30T23:37:31.114358.parquet
---
# Dataset Card for Evaluation run of uukuguy/speechless-orca-platypus-coig-lite-4k-0.5e-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/uukuguy/speechless-orca-platypus-coig-lite-4k-0.5e-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [uukuguy/speechless-orca-platypus-coig-lite-4k-0.5e-13b](https://huggingface.co/uukuguy/speechless-orca-platypus-coig-lite-4k-0.5e-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-orca-platypus-coig-lite-4k-0.5e-13b",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-30T23:37:31.114358](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-orca-platypus-coig-lite-4k-0.5e-13b/blob/main/results_2023-08-30T23%3A37%3A31.114358.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5724854113753837,
"acc_stderr": 0.03436338726163606,
"acc_norm": 0.5766543108597004,
"acc_norm_stderr": 0.03434541534553873,
"mc1": 0.3390452876376989,
"mc1_stderr": 0.016571797910626608,
"mc2": 0.4803942259941586,
"mc2_stderr": 0.015171048096844537
},
"harness|arc:challenge|25": {
"acc": 0.5366894197952219,
"acc_stderr": 0.01457200052775699,
"acc_norm": 0.5802047781569966,
"acc_norm_stderr": 0.014422181226303026
},
"harness|hellaswag|10": {
"acc": 0.5990838478390759,
"acc_stderr": 0.004890824718530301,
"acc_norm": 0.8015335590519816,
"acc_norm_stderr": 0.00398030097024142
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.029946498567699948,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.029946498567699948
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4913294797687861,
"acc_stderr": 0.038118909889404126,
"acc_norm": 0.4913294797687861,
"acc_norm_stderr": 0.038118909889404126
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4723404255319149,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.4723404255319149,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.34656084656084657,
"acc_stderr": 0.024508777521028424,
"acc_norm": 0.34656084656084657,
"acc_norm_stderr": 0.024508777521028424
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6645161290322581,
"acc_stderr": 0.026860206444724356,
"acc_norm": 0.6645161290322581,
"acc_norm_stderr": 0.026860206444724356
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43349753694581283,
"acc_stderr": 0.03486731727419872,
"acc_norm": 0.43349753694581283,
"acc_norm_stderr": 0.03486731727419872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.035014387062967806,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.035014387062967806
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.031353050095330855,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.031353050095330855
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5871794871794872,
"acc_stderr": 0.024962683564331806,
"acc_norm": 0.5871794871794872,
"acc_norm_stderr": 0.024962683564331806
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6050420168067226,
"acc_stderr": 0.03175367846096626,
"acc_norm": 0.6050420168067226,
"acc_norm_stderr": 0.03175367846096626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943342,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7339449541284404,
"acc_stderr": 0.018946022322225604,
"acc_norm": 0.7339449541284404,
"acc_norm_stderr": 0.018946022322225604
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849313,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849313
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6942148760330579,
"acc_stderr": 0.04205953933884123,
"acc_norm": 0.6942148760330579,
"acc_norm_stderr": 0.04205953933884123
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291518,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291518
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326468,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326468
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7863247863247863,
"acc_stderr": 0.026853450377009164,
"acc_norm": 0.7863247863247863,
"acc_norm_stderr": 0.026853450377009164
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7343550446998723,
"acc_stderr": 0.015794302487888726,
"acc_norm": 0.7343550446998723,
"acc_norm_stderr": 0.015794302487888726
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6502890173410405,
"acc_stderr": 0.025674281456531018,
"acc_norm": 0.6502890173410405,
"acc_norm_stderr": 0.025674281456531018
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.46368715083798884,
"acc_stderr": 0.01667834189453317,
"acc_norm": 0.46368715083798884,
"acc_norm_stderr": 0.01667834189453317
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.027914055510468008,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.027914055510468008
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6463022508038585,
"acc_stderr": 0.02715520810320087,
"acc_norm": 0.6463022508038585,
"acc_norm_stderr": 0.02715520810320087
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6419753086419753,
"acc_stderr": 0.026675611926037096,
"acc_norm": 0.6419753086419753,
"acc_norm_stderr": 0.026675611926037096
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.02970045324729148,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.02970045324729148
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4517601043024772,
"acc_stderr": 0.012710662233660247,
"acc_norm": 0.4517601043024772,
"acc_norm_stderr": 0.012710662233660247
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5110294117647058,
"acc_stderr": 0.030365446477275675,
"acc_norm": 0.5110294117647058,
"acc_norm_stderr": 0.030365446477275675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.019944914136873583,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.019944914136873583
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726492,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726492
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7114427860696517,
"acc_stderr": 0.03203841040213322,
"acc_norm": 0.7114427860696517,
"acc_norm_stderr": 0.03203841040213322
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117826,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3390452876376989,
"mc1_stderr": 0.016571797910626608,
"mc2": 0.4803942259941586,
"mc2_stderr": 0.015171048096844537
}
}
```
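Since the results JSON above is a plain nested mapping keyed by `"harness|<task>|<n_shot>"`, per-task metrics can be post-processed directly in Python. The snippet below is a minimal sketch, using a hand-copied subset of the scores shown above (not the full results file), that filters out the MMLU (`hendrycksTest`) subtasks and averages their `acc_norm`:

```python
# Minimal sketch: average acc_norm over a hand-copied subset of the
# per-task scores shown above. Keys follow the "harness|<task>|<n_shot>"
# naming used in the results JSON.
results = {
    "harness|hendrycksTest-virology|5": {"acc_norm": 0.4819277108433735},
    "harness|hendrycksTest-world_religions|5": {"acc_norm": 0.7660818713450293},
    "harness|hendrycksTest-us_foreign_policy|5": {"acc_norm": 0.8},
}

# Keep only the MMLU (hendrycksTest) subtasks and average their acc_norm.
mmlu = {k: v for k, v in results.items() if "hendrycksTest" in k}
mean_acc_norm = sum(v["acc_norm"] for v in mmlu.values()) / len(mmlu)
print(round(mean_acc_norm, 4))  # → 0.6827
```

The same pattern applies to the full results file once downloaded; only the dictionary source changes.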
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
FervGuida/SERJ | 2023-08-30T23:58:25.000Z | [
"region:us"
] | FervGuida | null | null | null | 0 | 0 | Entry not found |
purelife/XV5 | 2023-08-31T00:05:40.000Z | [
"license:openrail",
"region:us"
] | purelife | null | null | null | 0 | 0 | ---
license: openrail
---
|
yzhuang/autotree_automl_electricity_gosdt_l256_d3_sd0 | 2023-08-31T00:12:31.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 2773600000
num_examples: 100000
- name: validation
num_bytes: 277360000
num_examples: 10000
download_size: 691921046
dataset_size: 3050960000
---
# Dataset Card for "autotree_automl_electricity_gosdt_l256_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
qazisaad/llama_2_product_titles-esci_train_all | 2023-08-31T03:34:06.000Z | [
"region:us"
] | qazisaad | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: index
dtype: int64
- name: query
dtype: string
- name: average_score
dtype: float64
- name: total_score
dtype: float64
- name: text
dtype: string
- name: label
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 34404870
num_examples: 16404
download_size: 0
dataset_size: 34404870
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama_2_product_titles-esci_train_all"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
qazisaad/llama_2-product-titles-esci-train-all-temp | 2023-08-31T10:40:21.000Z | [
"region:us"
] | qazisaad | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: index
dtype: int64
- name: query
dtype: string
- name: average_score
dtype: float64
- name: total_score
dtype: float64
- name: text
dtype: string
- name: label
dtype: string
- name: preds
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 6964939
num_examples: 3060
download_size: 1087545
dataset_size: 6964939
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama_2-product-titles-esci-train-all-temp"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_TheBloke__Genz-70b-GPTQ | 2023-08-31T00:31:57.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TheBloke/Genz-70b-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/Genz-70b-GPTQ](https://huggingface.co/TheBloke/Genz-70b-GPTQ) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Genz-70b-GPTQ\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-31T00:30:34.342002](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Genz-70b-GPTQ/blob/main/results_2023-08-31T00%3A30%3A34.342002.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7017249416277331,\n\
\ \"acc_stderr\": 0.030832772804323012,\n \"acc_norm\": 0.70569345061239,\n\
\ \"acc_norm_stderr\": 0.03080075128019408,\n \"mc1\": 0.4320685434516524,\n\
\ \"mc1_stderr\": 0.01734120239498826,\n \"mc2\": 0.6228267270427654,\n\
\ \"mc2_stderr\": 0.014836432877772263\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6638225255972696,\n \"acc_stderr\": 0.013804855026205763,\n\
\ \"acc_norm\": 0.7107508532423208,\n \"acc_norm_stderr\": 0.013250012579393443\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.689205337582155,\n\
\ \"acc_stderr\": 0.004618730353217047,\n \"acc_norm\": 0.8764190400318662,\n\
\ \"acc_norm_stderr\": 0.0032843028764223\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8223684210526315,\n \"acc_stderr\": 0.03110318238312338,\n\
\ \"acc_norm\": 0.8223684210526315,\n \"acc_norm_stderr\": 0.03110318238312338\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n\
\ \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n \
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n\
\ \"acc_stderr\": 0.032166008088022675,\n \"acc_norm\": 0.8194444444444444,\n\
\ \"acc_norm_stderr\": 0.032166008088022675\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.676595744680851,\n \"acc_stderr\": 0.030579442773610337,\n\
\ \"acc_norm\": 0.676595744680851,\n \"acc_norm_stderr\": 0.030579442773610337\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.04013124195424386,\n\
\ \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.04013124195424386\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4417989417989418,\n \"acc_stderr\": 0.02557625706125384,\n \"\
acc_norm\": 0.4417989417989418,\n \"acc_norm_stderr\": 0.02557625706125384\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8290322580645161,\n\
\ \"acc_stderr\": 0.02141724293632159,\n \"acc_norm\": 0.8290322580645161,\n\
\ \"acc_norm_stderr\": 0.02141724293632159\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\"\
: 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8545454545454545,\n \"acc_stderr\": 0.027530196355066573,\n\
\ \"acc_norm\": 0.8545454545454545,\n \"acc_norm_stderr\": 0.027530196355066573\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822523,\n \"\
acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822523\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9378238341968912,\n \"acc_stderr\": 0.017426974154240528,\n\
\ \"acc_norm\": 0.9378238341968912,\n \"acc_norm_stderr\": 0.017426974154240528\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7051282051282052,\n \"acc_stderr\": 0.023119362758232294,\n\
\ \"acc_norm\": 0.7051282051282052,\n \"acc_norm_stderr\": 0.023119362758232294\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608463,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608463\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.773109243697479,\n \"acc_stderr\": 0.027205371538279472,\n \
\ \"acc_norm\": 0.773109243697479,\n \"acc_norm_stderr\": 0.027205371538279472\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5099337748344371,\n \"acc_stderr\": 0.04081677107248437,\n \"\
acc_norm\": 0.5099337748344371,\n \"acc_norm_stderr\": 0.04081677107248437\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8917431192660551,\n \"acc_stderr\": 0.013321348447611769,\n \"\
acc_norm\": 0.8917431192660551,\n \"acc_norm_stderr\": 0.013321348447611769\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6018518518518519,\n \"acc_stderr\": 0.033384734032074016,\n \"\
acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.033384734032074016\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073312,\n \"\
acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9029535864978903,\n \"acc_stderr\": 0.019269323025640262,\n \
\ \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.019269323025640262\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8116591928251121,\n\
\ \"acc_stderr\": 0.026241132996407252,\n \"acc_norm\": 0.8116591928251121,\n\
\ \"acc_norm_stderr\": 0.026241132996407252\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8396946564885496,\n \"acc_stderr\": 0.03217829420744633,\n\
\ \"acc_norm\": 0.8396946564885496,\n \"acc_norm_stderr\": 0.03217829420744633\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.859504132231405,\n \"acc_stderr\": 0.03172233426002157,\n \"acc_norm\"\
: 0.859504132231405,\n \"acc_norm_stderr\": 0.03172233426002157\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971726,\n\
\ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971726\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9102564102564102,\n\
\ \"acc_stderr\": 0.018724301741941642,\n \"acc_norm\": 0.9102564102564102,\n\
\ \"acc_norm_stderr\": 0.018724301741941642\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8722860791826309,\n\
\ \"acc_stderr\": 0.011935626313999876,\n \"acc_norm\": 0.8722860791826309,\n\
\ \"acc_norm_stderr\": 0.011935626313999876\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8005780346820809,\n \"acc_stderr\": 0.021511900654252562,\n\
\ \"acc_norm\": 0.8005780346820809,\n \"acc_norm_stderr\": 0.021511900654252562\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5754189944134078,\n\
\ \"acc_stderr\": 0.01653117099327888,\n \"acc_norm\": 0.5754189944134078,\n\
\ \"acc_norm_stderr\": 0.01653117099327888\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.02392915551735129,\n\
\ \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.02392915551735129\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7717041800643086,\n\
\ \"acc_stderr\": 0.023839303311398205,\n \"acc_norm\": 0.7717041800643086,\n\
\ \"acc_norm_stderr\": 0.023839303311398205\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8487654320987654,\n \"acc_stderr\": 0.019935086092149897,\n\
\ \"acc_norm\": 0.8487654320987654,\n \"acc_norm_stderr\": 0.019935086092149897\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5638297872340425,\n \"acc_stderr\": 0.029583452036284076,\n \
\ \"acc_norm\": 0.5638297872340425,\n \"acc_norm_stderr\": 0.029583452036284076\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5534550195567145,\n\
\ \"acc_stderr\": 0.012697046024399654,\n \"acc_norm\": 0.5534550195567145,\n\
\ \"acc_norm_stderr\": 0.012697046024399654\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.026679252270103135,\n\
\ \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.026679252270103135\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7630718954248366,\n \"acc_stderr\": 0.017201662169789772,\n \
\ \"acc_norm\": 0.7630718954248366,\n \"acc_norm_stderr\": 0.017201662169789772\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.02560737598657916,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.02560737598657916\n },\n\
\ \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\
\ \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n\
\ \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.025172984350155754,\n\
\ \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.025172984350155754\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4320685434516524,\n\
\ \"mc1_stderr\": 0.01734120239498826,\n \"mc2\": 0.6228267270427654,\n\
\ \"mc2_stderr\": 0.014836432877772263\n }\n}\n```"
repo_url: https://huggingface.co/TheBloke/Genz-70b-GPTQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|arc:challenge|25_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hellaswag|10_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T00:30:34.342002.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T00:30:34.342002.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T00:30:34.342002.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T00:30:34.342002.parquet'
- config_name: results
data_files:
- split: 2023_08_31T00_30_34.342002
path:
- results_2023-08-31T00:30:34.342002.parquet
- split: latest
path:
- results_2023-08-31T00:30:34.342002.parquet
---
# Dataset Card for Evaluation run of TheBloke/Genz-70b-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Genz-70b-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Genz-70b-GPTQ](https://huggingface.co/TheBloke/Genz-70b-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Genz-70b-GPTQ",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-31T00:30:34.342002](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Genz-70b-GPTQ/blob/main/results_2023-08-31T00%3A30%3A34.342002.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each eval's results in its "latest" split):
```python
{
"all": {
"acc": 0.7017249416277331,
"acc_stderr": 0.030832772804323012,
"acc_norm": 0.70569345061239,
"acc_norm_stderr": 0.03080075128019408,
"mc1": 0.4320685434516524,
"mc1_stderr": 0.01734120239498826,
"mc2": 0.6228267270427654,
"mc2_stderr": 0.014836432877772263
},
"harness|arc:challenge|25": {
"acc": 0.6638225255972696,
"acc_stderr": 0.013804855026205763,
"acc_norm": 0.7107508532423208,
"acc_norm_stderr": 0.013250012579393443
},
"harness|hellaswag|10": {
"acc": 0.689205337582155,
"acc_stderr": 0.004618730353217047,
"acc_norm": 0.8764190400318662,
"acc_norm_stderr": 0.0032843028764223
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8223684210526315,
"acc_stderr": 0.03110318238312338,
"acc_norm": 0.8223684210526315,
"acc_norm_stderr": 0.03110318238312338
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.032166008088022675,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.032166008088022675
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.676595744680851,
"acc_stderr": 0.030579442773610337,
"acc_norm": 0.676595744680851,
"acc_norm_stderr": 0.030579442773610337
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6344827586206897,
"acc_stderr": 0.04013124195424386,
"acc_norm": 0.6344827586206897,
"acc_norm_stderr": 0.04013124195424386
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4417989417989418,
"acc_stderr": 0.02557625706125384,
"acc_norm": 0.4417989417989418,
"acc_norm_stderr": 0.02557625706125384
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8290322580645161,
"acc_stderr": 0.02141724293632159,
"acc_norm": 0.8290322580645161,
"acc_norm_stderr": 0.02141724293632159
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8545454545454545,
"acc_stderr": 0.027530196355066573,
"acc_norm": 0.8545454545454545,
"acc_norm_stderr": 0.027530196355066573
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822523,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822523
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9378238341968912,
"acc_stderr": 0.017426974154240528,
"acc_norm": 0.9378238341968912,
"acc_norm_stderr": 0.017426974154240528
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7051282051282052,
"acc_stderr": 0.023119362758232294,
"acc_norm": 0.7051282051282052,
"acc_norm_stderr": 0.023119362758232294
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608463,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608463
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.773109243697479,
"acc_stderr": 0.027205371538279472,
"acc_norm": 0.773109243697479,
"acc_norm_stderr": 0.027205371538279472
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5099337748344371,
"acc_stderr": 0.04081677107248437,
"acc_norm": 0.5099337748344371,
"acc_norm_stderr": 0.04081677107248437
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8917431192660551,
"acc_stderr": 0.013321348447611769,
"acc_norm": 0.8917431192660551,
"acc_norm_stderr": 0.013321348447611769
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9313725490196079,
"acc_stderr": 0.017744453647073312,
"acc_norm": 0.9313725490196079,
"acc_norm_stderr": 0.017744453647073312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9029535864978903,
"acc_stderr": 0.019269323025640262,
"acc_norm": 0.9029535864978903,
"acc_norm_stderr": 0.019269323025640262
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8116591928251121,
"acc_stderr": 0.026241132996407252,
"acc_norm": 0.8116591928251121,
"acc_norm_stderr": 0.026241132996407252
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8396946564885496,
"acc_stderr": 0.03217829420744633,
"acc_norm": 0.8396946564885496,
"acc_norm_stderr": 0.03217829420744633
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.859504132231405,
"acc_stderr": 0.03172233426002157,
"acc_norm": 0.859504132231405,
"acc_norm_stderr": 0.03172233426002157
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.030446777687971726,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.030446777687971726
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9102564102564102,
"acc_stderr": 0.018724301741941642,
"acc_norm": 0.9102564102564102,
"acc_norm_stderr": 0.018724301741941642
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8722860791826309,
"acc_stderr": 0.011935626313999876,
"acc_norm": 0.8722860791826309,
"acc_norm_stderr": 0.011935626313999876
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8005780346820809,
"acc_stderr": 0.021511900654252562,
"acc_norm": 0.8005780346820809,
"acc_norm_stderr": 0.021511900654252562
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5754189944134078,
"acc_stderr": 0.01653117099327888,
"acc_norm": 0.5754189944134078,
"acc_norm_stderr": 0.01653117099327888
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.02392915551735129,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.02392915551735129
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7717041800643086,
"acc_stderr": 0.023839303311398205,
"acc_norm": 0.7717041800643086,
"acc_norm_stderr": 0.023839303311398205
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8487654320987654,
"acc_stderr": 0.019935086092149897,
"acc_norm": 0.8487654320987654,
"acc_norm_stderr": 0.019935086092149897
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5638297872340425,
"acc_stderr": 0.029583452036284076,
"acc_norm": 0.5638297872340425,
"acc_norm_stderr": 0.029583452036284076
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5534550195567145,
"acc_stderr": 0.012697046024399654,
"acc_norm": 0.5534550195567145,
"acc_norm_stderr": 0.012697046024399654
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7389705882352942,
"acc_stderr": 0.026679252270103135,
"acc_norm": 0.7389705882352942,
"acc_norm_stderr": 0.026679252270103135
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7630718954248366,
"acc_stderr": 0.017201662169789772,
"acc_norm": 0.7630718954248366,
"acc_norm_stderr": 0.017201662169789772
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8,
"acc_stderr": 0.02560737598657916,
"acc_norm": 0.8,
"acc_norm_stderr": 0.02560737598657916
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.025172984350155754,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.025172984350155754
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4320685434516524,
"mc1_stderr": 0.01734120239498826,
"mc2": 0.6228267270427654,
"mc2_stderr": 0.014836432877772263
}
}
```
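The aggregate block above can also be inspected offline as a plain Python dict. A minimal sketch (the values are copied verbatim from the "all" entry of the results JSON; the `as_percent` helper is illustrative, not part of the evaluation harness):

```python
# Aggregate metrics copied from the "all" entry of the results JSON above.
all_metrics = {
    "acc": 0.7017249416277331,
    "acc_stderr": 0.030832772804323012,
    "acc_norm": 0.70569345061239,
    "acc_norm_stderr": 0.03080075128019408,
    "mc1": 0.4320685434516524,
    "mc2": 0.6228267270427654,
}

def as_percent(value: float) -> str:
    """Format a 0-1 metric as a percentage with two decimals."""
    return f"{value * 100:.2f}%"

print("acc_norm:", as_percent(all_metrics["acc_norm"]))
```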
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
qazisaad/llama_2_optimized_product_titles-esci-4-7 | 2023-08-31T01:13:12.000Z | [
"region:us"
] | qazisaad | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: index
dtype: int64
- name: product_title
dtype: string
- name: average_score
dtype: float64
- name: total_score
dtype: float64
- name: text
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 28182010
num_examples: 9861
download_size: 4414697
dataset_size: 28182010
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama_2_optimized_product_titles-esci-4-7"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
qazisaad/llama-2-optimized-product-titles-esci-4-7-temp | 2023-08-31T10:45:55.000Z | [
"region:us"
] | qazisaad | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: index
dtype: int64
- name: product_title
dtype: string
- name: average_score
dtype: float64
- name: total_score
dtype: float64
- name: text
dtype: string
- name: preds
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 11551253
num_examples: 3660
download_size: 2198115
dataset_size: 11551253
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama-2-optimized-product-titles-esci-4-7-temp"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Admin08077/Shh | 2023-08-31T01:00:33.000Z | [
"license:other",
"region:us"
] | Admin08077 | null | null | null | 0 | 0 | ---
license: other
---
|
qazisaad/llama_2_optimized_product_titles-esci-test | 2023-08-31T01:11:20.000Z | [
"region:us"
] | qazisaad | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: level_0
dtype: int64
- name: index
dtype: int64
- name: product_title
dtype: string
- name: average_score
dtype: float64
- name: total_score
dtype: float64
- name: text
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 33872893
num_examples: 11924
download_size: 5228397
dataset_size: 33872893
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama_2_optimized_product_titles-esci-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_snnxor_n25_l1_2 | 2023-08-31T01:18:59.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 123760000
num_examples: 10000
- name: validation
num_bytes: 123760000
num_examples: 10000
- name: test
num_bytes: 123760000
num_examples: 10000
download_size: 183565390
dataset_size: 371280000
---
# Dataset Card for "autotree_snnxor_n25_l1_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
qazisaad/llama-2-optimized-product-titles-esci-test-temp | 2023-08-31T11:28:31.000Z | [
"region:us"
] | qazisaad | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: level_0
dtype: int64
- name: index
dtype: int64
- name: product_title
dtype: string
- name: average_score
dtype: float64
- name: total_score
dtype: float64
- name: text
dtype: string
- name: preds
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 11865828
num_examples: 3780
download_size: 2246163
dataset_size: 11865828
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama-2-optimized-product-titles-esci-test-temp"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_snnxor_n5_l1_2 | 2023-08-31T01:24:27.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 123760000
num_examples: 10000
- name: validation
num_bytes: 123760000
num_examples: 10000
- name: test
num_bytes: 123760000
num_examples: 10000
download_size: 183553539
dataset_size: 371280000
---
# Dataset Card for "autotree_snnxor_n5_l1_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
qazisaad/llama_2-product-titles-esci-test-temp | 2023-08-31T10:45:37.000Z | [
"region:us"
] | qazisaad | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: index
dtype: int64
- name: query
dtype: string
- name: average_score
dtype: float64
- name: total_score
dtype: float64
- name: text
dtype: string
- name: label
dtype: string
- name: preds
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 9174461
num_examples: 4140
download_size: 1472396
dataset_size: 9174461
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama_2-product-titles-esci-test-temp"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlvianKhairi/my-pandas-dataset-Abstract_No_Link_270k | 2023-08-31T01:44:12.000Z | [
"region:us"
] | AlvianKhairi | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 311885598
num_examples: 276037
download_size: 154449829
dataset_size: 311885598
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "my-pandas-dataset-Abstract_No_Link_270k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jakir057/bangladeshi_banknotes_70k | 2023-08-31T01:56:16.000Z | [
"region:us"
] | Jakir057 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '10'
'1': '100'
'2': '1000'
'3': '2'
'4': '20'
'5': '5'
'6': '50'
'7': '500'
splits:
- name: train
num_bytes: 5185934905.075951
num_examples: 59960
- name: test
num_bytes: 921148214.9020498
num_examples: 10582
download_size: 6163263960
dataset_size: 6107083119.978001
---
# Dataset Card for "bangladeshi_banknotes_70k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlvianKhairi/my-pandas-dataset-Abstract_No_Link_100k | 2023-08-31T01:58:34.000Z | [
"region:us"
] | AlvianKhairi | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 118205763
num_examples: 100000
download_size: 58583449
dataset_size: 118205763
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "my-pandas-dataset-Abstract_No_Link_100k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-huangyt_Fintune_1_17w | 2023-08-31T02:01:55.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w](https://huggingface.co/CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-huangyt_Fintune_1_17w\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-31T02:00:30.658003](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-huangyt_Fintune_1_17w/blob/main/results_2023-08-31T02%3A00%3A30.658003.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5442961947044164,\n\
\ \"acc_stderr\": 0.034390939607832384,\n \"acc_norm\": 0.5485046428987514,\n\
\ \"acc_norm_stderr\": 0.034371447823118406,\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766368,\n \"mc2\": 0.38171207935866835,\n\
\ \"mc2_stderr\": 0.014064032107804176\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5520477815699659,\n \"acc_stderr\": 0.01453201149821167,\n\
\ \"acc_norm\": 0.5947098976109215,\n \"acc_norm_stderr\": 0.014346869060229327\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6043616809400518,\n\
\ \"acc_stderr\": 0.0048798800921039595,\n \"acc_norm\": 0.8099980083648676,\n\
\ \"acc_norm_stderr\": 0.003915007231962101\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5131578947368421,\n \"acc_stderr\": 0.04067533136309173,\n\
\ \"acc_norm\": 0.5131578947368421,\n \"acc_norm_stderr\": 0.04067533136309173\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5811320754716981,\n \"acc_stderr\": 0.030365050829115205,\n\
\ \"acc_norm\": 0.5811320754716981,\n \"acc_norm_stderr\": 0.030365050829115205\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.041553199555931467,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.041553199555931467\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4913294797687861,\n\
\ \"acc_stderr\": 0.03811890988940412,\n \"acc_norm\": 0.4913294797687861,\n\
\ \"acc_norm_stderr\": 0.03811890988940412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929775,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929775\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.37872340425531914,\n \"acc_stderr\": 0.031709956060406545,\n\
\ \"acc_norm\": 0.37872340425531914,\n \"acc_norm_stderr\": 0.031709956060406545\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.328042328042328,\n \"acc_stderr\": 0.024180497164376893,\n \"\
acc_norm\": 0.328042328042328,\n \"acc_norm_stderr\": 0.024180497164376893\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574924,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574924\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6129032258064516,\n\
\ \"acc_stderr\": 0.027709359675032495,\n \"acc_norm\": 0.6129032258064516,\n\
\ \"acc_norm_stderr\": 0.027709359675032495\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4236453201970443,\n \"acc_stderr\": 0.03476725747649037,\n\
\ \"acc_norm\": 0.4236453201970443,\n \"acc_norm_stderr\": 0.03476725747649037\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6717171717171717,\n \"acc_stderr\": 0.03345678422756775,\n \"\
acc_norm\": 0.6717171717171717,\n \"acc_norm_stderr\": 0.03345678422756775\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n\
\ \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5435897435897435,\n \"acc_stderr\": 0.025254485424799602,\n\
\ \"acc_norm\": 0.5435897435897435,\n \"acc_norm_stderr\": 0.025254485424799602\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228405,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228405\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.5630252100840336,\n \"acc_stderr\": 0.03221943636566196,\n\
\ \"acc_norm\": 0.5630252100840336,\n \"acc_norm_stderr\": 0.03221943636566196\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7577981651376147,\n \"acc_stderr\": 0.01836817630659862,\n \"\
acc_norm\": 0.7577981651376147,\n \"acc_norm_stderr\": 0.01836817630659862\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643524,\n \"\
acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643524\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7352941176470589,\n \"acc_stderr\": 0.030964517926923403,\n \"\
acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.030964517926923403\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n\
\ \"acc_stderr\": 0.03236198350928276,\n \"acc_norm\": 0.6322869955156951,\n\
\ \"acc_norm_stderr\": 0.03236198350928276\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\
\ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6942148760330579,\n \"acc_stderr\": 0.042059539338841226,\n \"\
acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.042059539338841226\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6759259259259259,\n\
\ \"acc_stderr\": 0.04524596007030048,\n \"acc_norm\": 0.6759259259259259,\n\
\ \"acc_norm_stderr\": 0.04524596007030048\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935574,\n\
\ \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935574\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.042878587513404565,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.042878587513404565\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8034188034188035,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.8034188034188035,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7650063856960408,\n\
\ \"acc_stderr\": 0.015162024152278446,\n \"acc_norm\": 0.7650063856960408,\n\
\ \"acc_norm_stderr\": 0.015162024152278446\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.0261521986197268,\n\
\ \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.0261521986197268\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2659217877094972,\n\
\ \"acc_stderr\": 0.014776765066438893,\n \"acc_norm\": 0.2659217877094972,\n\
\ \"acc_norm_stderr\": 0.014776765066438893\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5947712418300654,\n \"acc_stderr\": 0.028110928492809075,\n\
\ \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.028110928492809075\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6237942122186495,\n\
\ \"acc_stderr\": 0.027513925683549434,\n \"acc_norm\": 0.6237942122186495,\n\
\ \"acc_norm_stderr\": 0.027513925683549434\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6419753086419753,\n \"acc_stderr\": 0.0266756119260371,\n\
\ \"acc_norm\": 0.6419753086419753,\n \"acc_norm_stderr\": 0.0266756119260371\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4078014184397163,\n \"acc_stderr\": 0.029316011776343555,\n \
\ \"acc_norm\": 0.4078014184397163,\n \"acc_norm_stderr\": 0.029316011776343555\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39504563233376794,\n\
\ \"acc_stderr\": 0.012485727813251551,\n \"acc_norm\": 0.39504563233376794,\n\
\ \"acc_norm_stderr\": 0.012485727813251551\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5073529411764706,\n \"acc_stderr\": 0.030369552523902173,\n\
\ \"acc_norm\": 0.5073529411764706,\n \"acc_norm_stderr\": 0.030369552523902173\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5490196078431373,\n \"acc_stderr\": 0.020130388312904528,\n \
\ \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.020130388312904528\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.046737523336702384,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.046737523336702384\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5795918367346938,\n \"acc_stderr\": 0.03160106993449601,\n\
\ \"acc_norm\": 0.5795918367346938,\n \"acc_norm_stderr\": 0.03160106993449601\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n\
\ \"acc_stderr\": 0.03076944496729601,\n \"acc_norm\": 0.746268656716418,\n\
\ \"acc_norm_stderr\": 0.03076944496729601\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816508\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766368,\n \"mc2\": 0.38171207935866835,\n\
\ \"mc2_stderr\": 0.014064032107804176\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|arc:challenge|25_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hellaswag|10_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T02:00:30.658003.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T02:00:30.658003.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T02:00:30.658003.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T02:00:30.658003.parquet'
- config_name: results
data_files:
- split: 2023_08_31T02_00_30.658003
path:
- results_2023-08-31T02:00:30.658003.parquet
- split: latest
path:
- results_2023-08-31T02:00:30.658003.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w](https://huggingface.co/CHIH-HUNG/llama-2-13b-huangyt_Fintune_1_17w) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-huangyt_Fintune_1_17w",
"harness_truthfulqa_mc_0",
split="train")
```
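The dated split names in the configs above are derived from each run's timestamp: compare the run `2023-08-31T02:00:30.658003` with its split `2023_08_31T02_00_30.658003`. A minimal sketch of that mapping, inferred from the file listing (the helper name is ours, not part of the leaderboard tooling):

```python
# Sketch (inferred from the data_files listing): a run timestamp such as
# "2023-08-31T02:00:30.658003" appears as the dated split name
# "2023_08_31T02_00_30.658003", i.e. "-" and ":" are replaced by "_".
def timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp to the dated split name used in this dataset."""
    return timestamp.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-08-31T02:00:30.658003"))
```

You can pass either the dated split name or `"latest"` as the `split` argument when loading a config.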
## Latest results
These are the [latest results from run 2023-08-31T02:00:30.658003](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-huangyt_Fintune_1_17w/blob/main/results_2023-08-31T02%3A00%3A30.658003.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5442961947044164,
"acc_stderr": 0.034390939607832384,
"acc_norm": 0.5485046428987514,
"acc_norm_stderr": 0.034371447823118406,
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766368,
"mc2": 0.38171207935866835,
"mc2_stderr": 0.014064032107804176
},
"harness|arc:challenge|25": {
"acc": 0.5520477815699659,
"acc_stderr": 0.01453201149821167,
"acc_norm": 0.5947098976109215,
"acc_norm_stderr": 0.014346869060229327
},
"harness|hellaswag|10": {
"acc": 0.6043616809400518,
"acc_stderr": 0.0048798800921039595,
"acc_norm": 0.8099980083648676,
"acc_norm_stderr": 0.003915007231962101
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5131578947368421,
"acc_stderr": 0.04067533136309173,
"acc_norm": 0.5131578947368421,
"acc_norm_stderr": 0.04067533136309173
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5811320754716981,
"acc_stderr": 0.030365050829115205,
"acc_norm": 0.5811320754716981,
"acc_norm_stderr": 0.030365050829115205
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.041553199555931467,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.041553199555931467
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4913294797687861,
"acc_stderr": 0.03811890988940412,
"acc_norm": 0.4913294797687861,
"acc_norm_stderr": 0.03811890988940412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929775,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929775
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.37872340425531914,
"acc_stderr": 0.031709956060406545,
"acc_norm": 0.37872340425531914,
"acc_norm_stderr": 0.031709956060406545
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.328042328042328,
"acc_stderr": 0.024180497164376893,
"acc_norm": 0.328042328042328,
"acc_norm_stderr": 0.024180497164376893
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574924,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574924
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6129032258064516,
"acc_stderr": 0.027709359675032495,
"acc_norm": 0.6129032258064516,
"acc_norm_stderr": 0.027709359675032495
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4236453201970443,
"acc_stderr": 0.03476725747649037,
"acc_norm": 0.4236453201970443,
"acc_norm_stderr": 0.03476725747649037
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6717171717171717,
"acc_stderr": 0.03345678422756775,
"acc_norm": 0.6717171717171717,
"acc_norm_stderr": 0.03345678422756775
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5435897435897435,
"acc_stderr": 0.025254485424799602,
"acc_norm": 0.5435897435897435,
"acc_norm_stderr": 0.025254485424799602
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228405,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228405
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5630252100840336,
"acc_stderr": 0.03221943636566196,
"acc_norm": 0.5630252100840336,
"acc_norm_stderr": 0.03221943636566196
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7577981651376147,
"acc_stderr": 0.01836817630659862,
"acc_norm": 0.7577981651376147,
"acc_norm_stderr": 0.01836817630659862
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643524,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643524
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.030964517926923403,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.030964517926923403
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.729957805907173,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.729957805907173,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928276,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928276
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5954198473282443,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.5954198473282443,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6942148760330579,
"acc_stderr": 0.042059539338841226,
"acc_norm": 0.6942148760330579,
"acc_norm_stderr": 0.042059539338841226
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.04524596007030048,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.04524596007030048
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935574,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935574
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404565,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404565
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8034188034188035,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.8034188034188035,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7650063856960408,
"acc_stderr": 0.015162024152278446,
"acc_norm": 0.7650063856960408,
"acc_norm_stderr": 0.015162024152278446
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.0261521986197268,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.0261521986197268
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2659217877094972,
"acc_stderr": 0.014776765066438893,
"acc_norm": 0.2659217877094972,
"acc_norm_stderr": 0.014776765066438893
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5947712418300654,
"acc_stderr": 0.028110928492809075,
"acc_norm": 0.5947712418300654,
"acc_norm_stderr": 0.028110928492809075
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6237942122186495,
"acc_stderr": 0.027513925683549434,
"acc_norm": 0.6237942122186495,
"acc_norm_stderr": 0.027513925683549434
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6419753086419753,
"acc_stderr": 0.0266756119260371,
"acc_norm": 0.6419753086419753,
"acc_norm_stderr": 0.0266756119260371
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4078014184397163,
"acc_stderr": 0.029316011776343555,
"acc_norm": 0.4078014184397163,
"acc_norm_stderr": 0.029316011776343555
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39504563233376794,
"acc_stderr": 0.012485727813251551,
"acc_norm": 0.39504563233376794,
"acc_norm_stderr": 0.012485727813251551
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5073529411764706,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.5073529411764706,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.020130388312904528,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.020130388312904528
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.046737523336702384,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.046737523336702384
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5795918367346938,
"acc_stderr": 0.03160106993449601,
"acc_norm": 0.5795918367346938,
"acc_norm_stderr": 0.03160106993449601
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.746268656716418,
"acc_stderr": 0.03076944496729601,
"acc_norm": 0.746268656716418,
"acc_norm_stderr": 0.03076944496729601
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766368,
"mc2": 0.38171207935866835,
"mc2_stderr": 0.014064032107804176
}
}
```
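To work with a results payload shaped like the JSON above, you can filter out the `"all"` aggregate and collect the per-task metrics. A hedged sketch (the dictionary below is a truncated stand-in for the full run, using a few values copied from above):

```python
# Sketch: extract per-task accuracies from a results dict shaped like the
# JSON above. This sample dict is a truncated stand-in, not the full run.
results = {
    "all": {"acc": 0.5442961947044164, "acc_norm": 0.5485046428987514},
    "harness|arc:challenge|25": {"acc": 0.5520477815699659, "acc_norm": 0.5947098976109215},
    "harness|hellaswag|10": {"acc": 0.6043616809400518, "acc_norm": 0.8099980083648676},
}

# Per-task acc_norm, skipping the "all" aggregate entry:
per_task = {
    task: metrics["acc_norm"]
    for task, metrics in results.items()
    if task != "all" and "acc_norm" in metrics
}
print(per_task)
```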
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
AlvianKhairi/my-pandas-dataset-Abstract_No_Link_50k | 2023-08-31T02:06:46.000Z | [
"region:us"
] | AlvianKhairi | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 59563045
num_examples: 50000
download_size: 29621501
dataset_size: 59563045
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "my-pandas-dataset-Abstract_No_Link_50k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MathiasFoster/whisper-data | 2023-08-31T02:49:25.000Z | [
"region:us"
] | MathiasFoster | null | null | null | 0 | 0 | Entry not found |
OKR/8888 | 2023-09-01T06:47:35.000Z | [
"license:apache-2.0",
"region:us"
] | OKR | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
yamiletzii/chavez | 2023-08-31T02:30:04.000Z | [
"license:unknown",
"region:us"
] | yamiletzii | null | null | null | 0 | 0 | ---
license: unknown
---
|
AlvianKhairi/my-pandas-dataset-Abstract_No_Link_25k | 2023-08-31T02:32:52.000Z | [
"region:us"
] | AlvianKhairi | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 30020414
num_examples: 25000
download_size: 14958534
dataset_size: 30020414
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "my-pandas-dataset-Abstract_No_Link_25k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CC1984/mall_receipt_extraction_dataset | 2023-08-31T03:02:26.000Z | [
"license:mit",
"region:us"
] | CC1984 | null | null | null | 0 | 0 | ---
license: mit
---
|
hungshinlee/hakkaradio_sixian | 2023-08-31T06:51:43.000Z | [
"region:us"
] | hungshinlee | null | null | null | 0 | 0 | # hakkaradio_sixian
|
movie13/Soft | 2023-09-23T12:28:22.000Z | [
"region:us"
] | movie13 | null | null | null | 0 | 0 | Entry not found |
daisyjojo/deeprx_zipped | 2023-08-31T07:56:46.000Z | [
"license:other",
"region:us"
] | daisyjojo | null | null | null | 0 | 0 | ---
license: other
---
|
benjis/bigvul | 2023-08-31T03:02:50.000Z | [
"region:us"
] | benjis | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: CVE ID
dtype: string
- name: CVE Page
dtype: string
- name: CWE ID
dtype: string
- name: codeLink
dtype: string
- name: commit_id
dtype: string
- name: commit_message
dtype: string
- name: func_after
dtype: string
- name: func_before
dtype: string
- name: lang
dtype: string
- name: project
dtype: string
- name: vul
dtype: int8
splits:
- name: train
num_bytes: 404950685.2579571
num_examples: 150908
- name: validation
num_bytes: 88684597.21877055
num_examples: 33049
- name: test
num_bytes: 88687280.64632414
num_examples: 33050
download_size: 252969708
dataset_size: 582322563.1230518
---
# Dataset Card for "bigvul"
Unofficial, not affiliated with the authors.
- **Paper:** https://doi.org/10.1145/3379597.3387501
- **Repository:** https://github.com/ZeoVan/MSR_20_Code_vulnerability_CSV_Dataset |
alkaidavituxa/denden | 2023-08-31T03:03:58.000Z | [
"region:us"
] | alkaidavituxa | null | null | null | 0 | 0 | Entry not found |
leiwx52/ViT-Lens-dev | 2023-08-31T06:38:40.000Z | [
"region:us"
] | leiwx52 | null | null | null | 0 | 0 | Entry not found |
deodot/yudfgd | 2023-08-31T03:09:07.000Z | [
"region:us"
] | deodot | null | null | null | 0 | 0 | Entry not found |
xf7s4haw/kjkjhg | 2023-08-31T03:17:54.000Z | [
"region:us"
] | xf7s4haw | null | null | null | 0 | 0 | Entry not found |
wilsonpickett/oioygjf | 2023-08-31T03:22:51.000Z | [
"region:us"
] | wilsonpickett | null | null | null | 0 | 0 | Entry not found |
cjrxqekq/fhgdfhd54 | 2023-08-31T03:29:46.000Z | [
"region:us"
] | cjrxqekq | null | null | null | 0 | 0 | Entry not found |
anhnct/Gligen | 2023-08-31T03:35:36.000Z | [
"region:us"
] | anhnct | null | null | null | 0 | 0 | Entry not found |
RevengerArtz/JotaroKujoENG | 2023-08-31T03:40:04.000Z | [
"region:us"
] | RevengerArtz | null | null | null | 0 | 0 | Entry not found |
yzhuang/autotree_automl_covertype_gosdt_l256_d3_sd0 | 2023-08-31T03:45:48.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | Entry not found |
changjacHp/lol_champions_threat_synergy | 2023-09-04T05:58:50.000Z | [
"region:us"
] | changjacHp | null | null | null | 0 | 0 | Entry not found |
yzhuang/autotree_automl_california_gosdt_l256_d3_sd0 | 2023-08-31T03:51:42.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 2978400000
num_examples: 100000
- name: validation
num_bytes: 297840000
num_examples: 10000
download_size: 1134010508
dataset_size: 3276240000
---
# Dataset Card for "autotree_automl_california_gosdt_l256_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JamesStratford/voice-of-birds | 2023-08-31T08:25:41.000Z | [
"region:us"
] | JamesStratford | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: audio
dtype: audio
- name: label
dtype:
class_label:
names:
'0': Andean Guan
'1': Andean Tinamou
'2': Australian Brushturkey
'3': Band-tailed Guan
'4': Barred Tinamou
'5': Bartletts Tinamou
'6': Baudo Guan
'7': Bearded Guan
'8': Berlepschs Tinamou
'9': Biak Scrubfowl
'10': Black Tinamou
'11': Black-billed Brushturkey
'12': Black-capped Tinamou
'13': Black-fronted Piping Guan
'14': Blue-throated Piping Guan
'15': Brazilian Tinamou
'16': Brown Tinamou
'17': Brushland Tinamou
'18': Buff-browed Chachalaca
'19': Cauca Guan
'20': Chaco Chachalaca
'21': Chestnut-bellied Guan
'22': Chestnut-headed Chachalaca
'23': Chestnut-winged Chachalaca
'24': Chilean Tinamou
'25': Choco Tinamou
'26': Cinereous Tinamou
'27': Collared Brushturkey
'28': Colombian Chachalaca
'29': Common Ostrich
'30': Crested Guan
'31': Curve-billed Tinamou
'32': Darwins Nothura
'33': Dusky Megapode
'34': Dusky-legged Guan
'35': Dwarf Cassowary
'36': Dwarf Tinamou
'37': East Brazilian Chachalaca
'38': Elegant Crested Tinamou
'39': Emu
'40': Great Spotted Kiwi
'41': Great Tinamou
'42': Greater Rhea
'43': Grey Tinamou
'44': Grey-headed Chachalaca
'45': Grey-legged Tinamou
'46': Highland Tinamou
'47': Hooded Tinamou
'48': Huayco Tinamou
'49': Lesser Nothura
'50': Lesser Rhea
'51': Little Chachalaca
'52': Little Spotted Kiwi
'53': Little Tinamou
'54': Maleo
'55': Malleefowl
'56': Marail Guan
'57': Melanesian Megapode
'58': Micronesian Megapode
'59': Moluccan Megapode
'60': New Guinea Scrubfowl
'61': Nicobar Megapode
'62': North Island Brown Kiwi
'63': Northern Cassowary
'64': Okarito Kiwi
'65': Orange-footed Scrubfowl
'66': Ornate Tinamou
'67': Pale-browed Tinamou
'68': Patagonian Tinamou
'69': Philippine Megapode
'70': Plain Chachalaca
'71': Puna Tinamou
'72': Quebracho Crested Tinamou
'73': Red-billed Brushturkey
'74': Red-faced Guan
'75': Red-legged Tinamou
'76': Red-throated Piping Guan
'77': Red-winged Tinamou
'78': Rufous-bellied Chachalaca
'79': Rufous-headed Chachalaca
'80': Rufous-vented Chachalaca
'81': Rusty Tinamou
'82': Rusty-margined Guan
'83': Scaled Chachalaca
'84': Slaty-breasted Tinamou
'85': Small-billed Tinamou
'86': Solitary Tinamou
'87': Somali Ostrich
'88': Southern Brown Kiwi
'89': Southern Cassowary
'90': Speckled Chachalaca
'91': Spixs Guan
'92': Spotted Nothura
'93': Sula Megapode
'94': Taczanowskis Tinamou
'95': Tanimbar Megapode
'96': Tataupa Tinamou
'97': Tawny-breasted Tinamou
'98': Tepui Tinamou
'99': Thicket Tinamou
'100': Tongan Megapode
'101': Trinidad Piping Guan
'102': Undulated Tinamou
'103': Vanuatu Megapode
'104': Variegated Tinamou
'105': Wattled Brushturkey
'106': West Mexican Chachalaca
'107': White-bellied Chachalaca
'108': White-bellied Nothura
'109': White-browed Guan
'110': White-crested Guan
'111': White-throated Tinamou
'112': White-winged Guan
'113': Yellow-legged Tinamou
splits:
- name: train
num_bytes: 4195346823.345
num_examples: 1723
- name: test
num_bytes: 1127981706.0
num_examples: 431
download_size: 3288384913
dataset_size: 5323328529.344999
---
# Dataset Card for "voice-of-birds"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
josephilo/data1 | 2023-08-31T04:32:45.000Z | [
"region:us"
] | josephilo | null | null | null | 0 | 0 | Entry not found |
grthyujikuo/thyujyiulopk576 | 2023-08-31T04:28:02.000Z | [
"region:us"
] | grthyujikuo | null | null | null | 0 | 0 | Entry not found |
thyujyikuopk/gtrhytujyilopioi5768 | 2023-08-31T04:28:13.000Z | [
"region:us"
] | thyujyikuopk | null | null | null | 0 | 0 | Entry not found |
trhytujyikuop/trytujyikuopioi768 | 2023-08-31T04:28:18.000Z | [
"region:us"
] | trhytujyikuop | null | null | null | 0 | 0 | Entry not found |
thytujykiupo/rethrytujyiuopoi789 | 2023-08-31T04:28:21.000Z | [
"region:us"
] | thytujykiupo | null | null | null | 1 | 0 | Entry not found |
thyjuykiopok/htrytuykiuolik7789 | 2023-08-31T04:29:43.000Z | [
"region:us"
] | thyjuykiopok | null | null | null | 0 | 0 | Entry not found |
yzhuang/autotree_automl_pol_gosdt_l256_d3_sd0 | 2023-08-31T04:43:10.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: int64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: int64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 6664800000
num_examples: 100000
- name: validation
num_bytes: 666480000
num_examples: 10000
download_size: 479767259
dataset_size: 7331280000
---
# Dataset Card for "autotree_automl_pol_gosdt_l256_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_pmlb_banana_sgosdt_l256_d3_sd0 | 2023-08-31T04:50:10.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 123760000
num_examples: 10000
- name: validation
num_bytes: 123760000
num_examples: 10000
download_size: 49313232
dataset_size: 247520000
---
# Dataset Card for "autotree_pmlb_banana_sgosdt_l256_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rhtyuikuopk/rthyjutytgjuy54 | 2023-08-31T05:10:47.000Z | [
"region:us"
] | rhtyuikuopk | null | null | null | 0 | 0 | Entry not found |
rthyujyikuopu/thytujyikuok78 | 2023-08-31T05:10:52.000Z | [
"region:us"
] | rthyujyikuopu | null | null | null | 0 | 0 | Entry not found |
thytujyikuopo/tyjuykiuoipi76877 | 2023-08-31T05:11:38.000Z | [
"region:us"
] | thytujyikuopo | null | null | null | 0 | 0 | Entry not found |
drgthyujikuopio/rhytujyikuopokij76 | 2023-08-31T05:11:55.000Z | [
"region:us"
] | drgthyujikuopio | null | null | null | 0 | 0 | Entry not found |
thyjuikolipu/tyuiuopiok7687 | 2023-08-31T05:11:49.000Z | [
"region:us"
] | thyjuikolipu | null | null | null | 0 | 0 | Entry not found |
ashwincv0112/Code_Generation_avp | 2023-08-31T05:07:15.000Z | [
"region:us"
] | ashwincv0112 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: Text Query
dtype: string
- name: Code Generated
dtype: string
splits:
- name: train
num_bytes: 1351
num_examples: 11
download_size: 2412
dataset_size: 1351
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Code_Generation_avp"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NarchAI1992/Townhouse_ineriorv1 | 2023-08-31T05:18:58.000Z | [
"license:openrail",
"region:us"
] | NarchAI1992 | null | null | null | 0 | 0 | ---
license: openrail
---
|
linhqyy/result_with_finetuned_taggenv2_9epoch_encoder_embeddings | 2023-08-31T05:35:14.000Z | [
"region:us"
] | linhqyy | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: id
dtype: string
- name: w2v2_baseline_transcription
dtype: string
- name: w2v2_baseline_norm
dtype: string
splits:
- name: train
num_bytes: 174371562.027
num_examples: 1299
download_size: 164200927
dataset_size: 174371562.027
---
# Dataset Card for "result_with_finetuned_taggenv2_9epoch_encoder_embeddings"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_pmlb_spambase_sgosdt_l256_d3_sd0 | 2023-08-31T05:54:58.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 364942720
num_examples: 10000
- name: validation
num_bytes: 364893568
num_examples: 10000
download_size: 116435922
dataset_size: 729836288
---
# Dataset Card for "autotree_pmlb_spambase_sgosdt_l256_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
linhqyy/result_with_finetuned_taggenv2_10epoch_encoder_embeddings_decoder_roberta | 2023-08-31T05:59:08.000Z | [
"region:us"
] | linhqyy | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: id
dtype: string
- name: w2v2_baseline_transcription
dtype: string
- name: w2v2_baseline_norm
dtype: string
splits:
- name: train
num_bytes: 174371675.027
num_examples: 1299
download_size: 164200911
dataset_size: 174371675.027
---
# Dataset Card for "result_with_finetuned_taggenv2_10epoch_encoder_embeddings_decoder_roberta"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
datascience555/mock_test_comments | 2023-08-31T07:01:30.000Z | [
"region:us"
] | datascience555 | null | null | null | 0 | 0 | Entry not found |
yzhuang/autotree_pmlb_phoneme_sgosdt_l256_d3_sd0 | 2023-08-31T06:35:17.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 154480000
num_examples: 10000
- name: validation
num_bytes: 154480000
num_examples: 10000
download_size: 67311028
dataset_size: 308960000
---
# Dataset Card for "autotree_pmlb_phoneme_sgosdt_l256_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_TheBloke__Llama-2-70B-chat-GPTQ | 2023-08-31T06:36:16.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TheBloke/Llama-2-70B-chat-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/Llama-2-70B-chat-GPTQ](https://huggingface.co/TheBloke/Llama-2-70B-chat-GPTQ)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Llama-2-70B-chat-GPTQ\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-31T06:34:53.347292](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Llama-2-70B-chat-GPTQ/blob/main/results_2023-08-31T06%3A34%3A53.347292.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6271417335583028,\n\
\ \"acc_stderr\": 0.032984385584095444,\n \"acc_norm\": 0.6311690860463957,\n\
\ \"acc_norm_stderr\": 0.032959972116116315,\n \"mc1\": 0.3463892288861689,\n\
\ \"mc1_stderr\": 0.01665699710912514,\n \"mc2\": 0.509833074268794,\n\
\ \"mc2_stderr\": 0.015705295947458175\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5810580204778157,\n \"acc_stderr\": 0.014418106953639011,\n\
\ \"acc_norm\": 0.6262798634812287,\n \"acc_norm_stderr\": 0.014137708601759093\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6557458673571002,\n\
\ \"acc_stderr\": 0.0047415341064702835,\n \"acc_norm\": 0.8481378211511651,\n\
\ \"acc_norm_stderr\": 0.0035815378475817965\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n\
\ \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6377358490566037,\n \"acc_stderr\": 0.029582245128384303,\n\
\ \"acc_norm\": 0.6377358490566037,\n \"acc_norm_stderr\": 0.029582245128384303\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\
\ \"acc_stderr\": 0.0373362665538351,\n \"acc_norm\": 0.6011560693641619,\n\
\ \"acc_norm_stderr\": 0.0373362665538351\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.032600385118357715,\n\
\ \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.032600385118357715\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n\
\ \"acc_stderr\": 0.04537815354939392,\n \"acc_norm\": 0.3684210526315789,\n\
\ \"acc_norm_stderr\": 0.04537815354939392\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3994708994708995,\n \"acc_stderr\": 0.025225450284067877,\n \"\
acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.025225450284067877\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7612903225806451,\n \"acc_stderr\": 0.024251071262208837,\n \"\
acc_norm\": 0.7612903225806451,\n \"acc_norm_stderr\": 0.024251071262208837\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.46798029556650245,\n \"acc_stderr\": 0.035107665979592154,\n \"\
acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463355,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463355\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6282051282051282,\n \"acc_stderr\": 0.024503472557110936,\n\
\ \"acc_norm\": 0.6282051282051282,\n \"acc_norm_stderr\": 0.024503472557110936\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.02784081149587192,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.02784081149587192\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3973509933774834,\n \"acc_stderr\": 0.039955240076816806,\n \"\
acc_norm\": 0.3973509933774834,\n \"acc_norm_stderr\": 0.039955240076816806\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8366972477064221,\n \"acc_stderr\": 0.01584825580650154,\n \"\
acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.01584825580650154\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8627450980392157,\n \"acc_stderr\": 0.024152225962801588,\n \"\
acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.024152225962801588\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8396624472573839,\n \"acc_stderr\": 0.02388438092596567,\n \
\ \"acc_norm\": 0.8396624472573839,\n \"acc_norm_stderr\": 0.02388438092596567\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.030636591348699813,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.030636591348699813\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.03602814176392645,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.03602814176392645\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077812,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077812\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.80970625798212,\n\
\ \"acc_stderr\": 0.01403694585038138,\n \"acc_norm\": 0.80970625798212,\n\
\ \"acc_norm_stderr\": 0.01403694585038138\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.329608938547486,\n\
\ \"acc_stderr\": 0.015721531075183873,\n \"acc_norm\": 0.329608938547486,\n\
\ \"acc_norm_stderr\": 0.015721531075183873\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.026493033225145905,\n\
\ \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.026493033225145905\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.02558306248998482,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.02558306248998482\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.025407197798890162,\n\
\ \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.025407197798890162\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4432624113475177,\n \"acc_stderr\": 0.029634838473766002,\n \
\ \"acc_norm\": 0.4432624113475177,\n \"acc_norm_stderr\": 0.029634838473766002\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n\
\ \"acc_stderr\": 0.012744149704869647,\n \"acc_norm\": 0.4680573663624511,\n\
\ \"acc_norm_stderr\": 0.012744149704869647\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5698529411764706,\n \"acc_stderr\": 0.030074971917302875,\n\
\ \"acc_norm\": 0.5698529411764706,\n \"acc_norm_stderr\": 0.030074971917302875\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6601307189542484,\n \"acc_stderr\": 0.019162418588623553,\n \
\ \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.019162418588623553\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7591836734693878,\n \"acc_stderr\": 0.027372942201788156,\n\
\ \"acc_norm\": 0.7591836734693878,\n \"acc_norm_stderr\": 0.027372942201788156\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.02553843336857833,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.02553843336857833\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"\
acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368036,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368036\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3463892288861689,\n\
\ \"mc1_stderr\": 0.01665699710912514,\n \"mc2\": 0.509833074268794,\n\
\ \"mc2_stderr\": 0.015705295947458175\n }\n}\n```"
repo_url: https://huggingface.co/TheBloke/Llama-2-70B-chat-GPTQ
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|arc:challenge|25_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hellaswag|10_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T06:34:53.347292.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T06:34:53.347292.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T06:34:53.347292.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T06:34:53.347292.parquet'
- config_name: results
data_files:
- split: 2023_08_31T06_34_53.347292
path:
- results_2023-08-31T06:34:53.347292.parquet
- split: latest
path:
- results_2023-08-31T06:34:53.347292.parquet
---
# Dataset Card for Evaluation run of TheBloke/Llama-2-70B-chat-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Llama-2-70B-chat-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Llama-2-70B-chat-GPTQ](https://huggingface.co/TheBloke/Llama-2-70B-chat-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Llama-2-70B-chat-GPTQ",
"harness_truthfulqa_mc_0",
split="train")
```
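As a side note, the per-run split names are derived from the run timestamp, with `-` and `:` replaced by `_` (the dot in the fractional seconds is kept). A minimal sketch of that mapping (the helper name is illustrative, not part of the leaderboard tooling):

```python
def timestamp_to_split(ts: str) -> str:
    """Convert a run timestamp (as it appears in file names) into a split name."""
    # Split names replace '-' and ':' with '_'; the '.' before microseconds stays.
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-08-31T06:34:53.347292"))
# 2023_08_31T06_34_53.347292
```

The "latest" split is an alias that always points at the files of the most recent run.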
## Latest results
These are the [latest results from run 2023-08-31T06:34:53.347292](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Llama-2-70B-chat-GPTQ/blob/main/results_2023-08-31T06%3A34%3A53.347292.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6271417335583028,
"acc_stderr": 0.032984385584095444,
"acc_norm": 0.6311690860463957,
"acc_norm_stderr": 0.032959972116116315,
"mc1": 0.3463892288861689,
"mc1_stderr": 0.01665699710912514,
"mc2": 0.509833074268794,
"mc2_stderr": 0.015705295947458175
},
"harness|arc:challenge|25": {
"acc": 0.5810580204778157,
"acc_stderr": 0.014418106953639011,
"acc_norm": 0.6262798634812287,
"acc_norm_stderr": 0.014137708601759093
},
"harness|hellaswag|10": {
"acc": 0.6557458673571002,
"acc_stderr": 0.0047415341064702835,
"acc_norm": 0.8481378211511651,
"acc_norm_stderr": 0.0035815378475817965
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6377358490566037,
"acc_stderr": 0.029582245128384303,
"acc_norm": 0.6377358490566037,
"acc_norm_stderr": 0.029582245128384303
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.0373362665538351,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.0373362665538351
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939392,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939392
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.025225450284067877,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.025225450284067877
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.024251071262208837,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.024251071262208837
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463355,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463355
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6282051282051282,
"acc_stderr": 0.024503472557110936,
"acc_norm": 0.6282051282051282,
"acc_norm_stderr": 0.024503472557110936
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.02784081149587192,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.02784081149587192
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3973509933774834,
"acc_stderr": 0.039955240076816806,
"acc_norm": 0.3973509933774834,
"acc_norm_stderr": 0.039955240076816806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.01584825580650154,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.01584825580650154
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.024152225962801588,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.024152225962801588
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8396624472573839,
"acc_stderr": 0.02388438092596567,
"acc_norm": 0.8396624472573839,
"acc_norm_stderr": 0.02388438092596567
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699813,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699813
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.03602814176392645,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.03602814176392645
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077812,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077812
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.80970625798212,
"acc_stderr": 0.01403694585038138,
"acc_norm": 0.80970625798212,
"acc_norm_stderr": 0.01403694585038138
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.02425790170532338,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.02425790170532338
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.329608938547486,
"acc_stderr": 0.015721531075183873,
"acc_norm": 0.329608938547486,
"acc_norm_stderr": 0.015721531075183873
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.026493033225145905,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.026493033225145905
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998482,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998482
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.025407197798890162,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.025407197798890162
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4432624113475177,
"acc_stderr": 0.029634838473766002,
"acc_norm": 0.4432624113475177,
"acc_norm_stderr": 0.029634838473766002
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869647,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869647
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5698529411764706,
"acc_stderr": 0.030074971917302875,
"acc_norm": 0.5698529411764706,
"acc_norm_stderr": 0.030074971917302875
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.019162418588623553,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.019162418588623553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7591836734693878,
"acc_stderr": 0.027372942201788156,
"acc_norm": 0.7591836734693878,
"acc_norm_stderr": 0.027372942201788156
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.02553843336857833,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.02553843336857833
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368036,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368036
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3463892288861689,
"mc1_stderr": 0.01665699710912514,
"mc2": 0.509833074268794,
"mc2_stderr": 0.015705295947458175
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Poupou/gr18 | 2023-09-06T09:58:43.000Z | [
"license:openrail",
"region:us"
] | Poupou | null | null | null | 0 | 0 | ---
license: openrail
---
|
open-llm-leaderboard/details_TheBloke__Lemur-70B-Chat-v1-GPTQ | 2023-08-31T06:47:35.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TheBloke/Lemur-70B-Chat-v1-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/Lemur-70B-Chat-v1-GPTQ](https://huggingface.co/TheBloke/Lemur-70B-Chat-v1-GPTQ)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Lemur-70B-Chat-v1-GPTQ\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-31T06:46:13.725525](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Lemur-70B-Chat-v1-GPTQ/blob/main/results_2023-08-31T06%3A46%3A13.725525.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6468074911221942,\n\
\ \"acc_stderr\": 0.03281612856930076,\n \"acc_norm\": 0.6509040444920074,\n\
\ \"acc_norm_stderr\": 0.032790646231639874,\n \"mc1\": 0.3818849449204406,\n\
\ \"mc1_stderr\": 0.01700810193916349,\n \"mc2\": 0.5711470281396481,\n\
\ \"mc2_stderr\": 0.015283087726691595\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6075085324232082,\n \"acc_stderr\": 0.014269634635670724,\n\
\ \"acc_norm\": 0.6527303754266212,\n \"acc_norm_stderr\": 0.013913034529620446\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6475801633140809,\n\
\ \"acc_stderr\": 0.00476747536668976,\n \"acc_norm\": 0.8440549691296555,\n\
\ \"acc_norm_stderr\": 0.003620617550747387\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5407407407407407,\n\
\ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.5407407407407407,\n\
\ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119667,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119667\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.67,\n\
\ \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n \
\ \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416906,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416906\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n\
\ \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.046570472605949625,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.046570472605949625\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192118,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192118\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4603174603174603,\n \"acc_stderr\": 0.025670080636909186,\n \"\
acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.025670080636909186\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n\
\ \"acc_stderr\": 0.02425107126220884,\n \"acc_norm\": 0.7612903225806451,\n\
\ \"acc_norm_stderr\": 0.02425107126220884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"\
acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228426,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228426\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.41721854304635764,\n \"acc_stderr\": 0.04026141497634611,\n \"\
acc_norm\": 0.41721854304635764,\n \"acc_norm_stderr\": 0.04026141497634611\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976064,\n \"\
acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976064\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8354430379746836,\n \"acc_stderr\": 0.024135736240566932,\n \
\ \"acc_norm\": 0.8354430379746836,\n \"acc_norm_stderr\": 0.024135736240566932\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7309417040358744,\n\
\ \"acc_stderr\": 0.029763779406874972,\n \"acc_norm\": 0.7309417040358744,\n\
\ \"acc_norm_stderr\": 0.029763779406874972\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8429752066115702,\n \"acc_stderr\": 0.03321244842547128,\n \"\
acc_norm\": 0.8429752066115702,\n \"acc_norm_stderr\": 0.03321244842547128\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8135376756066411,\n\
\ \"acc_stderr\": 0.013927751372001512,\n \"acc_norm\": 0.8135376756066411,\n\
\ \"acc_norm_stderr\": 0.013927751372001512\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044287,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044287\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5072625698324023,\n\
\ \"acc_stderr\": 0.0167207374051795,\n \"acc_norm\": 0.5072625698324023,\n\
\ \"acc_norm_stderr\": 0.0167207374051795\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279046,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279046\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.025839898334877983,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.025839898334877983\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712992,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712992\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49934810951760106,\n\
\ \"acc_stderr\": 0.012770225252255534,\n \"acc_norm\": 0.49934810951760106,\n\
\ \"acc_norm_stderr\": 0.012770225252255534\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.028814722422254184,\n\
\ \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.028814722422254184\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6830065359477124,\n \"acc_stderr\": 0.01882421951270621,\n \
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.01882421951270621\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7918367346938775,\n \"acc_stderr\": 0.025991117672813296,\n\
\ \"acc_norm\": 0.7918367346938775,\n \"acc_norm_stderr\": 0.025991117672813296\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578334,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.038913644958358175,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.038913644958358175\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338734,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338734\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3818849449204406,\n\
\ \"mc1_stderr\": 0.01700810193916349,\n \"mc2\": 0.5711470281396481,\n\
\ \"mc2_stderr\": 0.015283087726691595\n }\n}\n```"
repo_url: https://huggingface.co/DataLinguistic/DataLinguistic-34B-V1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|arc:challenge|25_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hellaswag|10_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T06:46:13.725525.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T06:46:13.725525.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T06:46:13.725525.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T06:46:13.725525.parquet'
- config_name: results
data_files:
- split: 2023_08_31T06_46_13.725525
path:
- results_2023-08-31T06:46:13.725525.parquet
- split: latest
path:
- results_2023-08-31T06:46:13.725525.parquet
---
# Dataset Card for Evaluation run of TheBloke/Lemur-70B-Chat-v1-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Lemur-70B-Chat-v1-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Lemur-70B-Chat-v1-GPTQ](https://huggingface.co/TheBloke/Lemur-70B-Chat-v1-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, one for each evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Lemur-70B-Chat-v1-GPTQ",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-31T06:46:13.725525](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Lemur-70B-Chat-v1-GPTQ/blob/main/results_2023-08-31T06%3A46%3A13.725525.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; each task's results live in its own configuration, with the "latest" split pointing to the most recent eval):
```python
{
"all": {
"acc": 0.6468074911221942,
"acc_stderr": 0.03281612856930076,
"acc_norm": 0.6509040444920074,
"acc_norm_stderr": 0.032790646231639874,
"mc1": 0.3818849449204406,
"mc1_stderr": 0.01700810193916349,
"mc2": 0.5711470281396481,
"mc2_stderr": 0.015283087726691595
},
"harness|arc:challenge|25": {
"acc": 0.6075085324232082,
"acc_stderr": 0.014269634635670724,
"acc_norm": 0.6527303754266212,
"acc_norm_stderr": 0.013913034529620446
},
"harness|hellaswag|10": {
"acc": 0.6475801633140809,
"acc_stderr": 0.00476747536668976,
"acc_norm": 0.8440549691296555,
"acc_norm_stderr": 0.003620617550747387
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5407407407407407,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.5407407407407407,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119667,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119667
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416906,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416906
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.046570472605949625,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.046570472605949625
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192118,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192118
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.025670080636909186,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.025670080636909186
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.02425107126220884,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.02425107126220884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218967,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218967
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228426,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228426
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.41721854304635764,
"acc_stderr": 0.04026141497634611,
"acc_norm": 0.41721854304635764,
"acc_norm_stderr": 0.04026141497634611
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.015919557829976064,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.015919557829976064
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8354430379746836,
"acc_stderr": 0.024135736240566932,
"acc_norm": 0.8354430379746836,
"acc_norm_stderr": 0.024135736240566932
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7309417040358744,
"acc_stderr": 0.029763779406874972,
"acc_norm": 0.7309417040358744,
"acc_norm_stderr": 0.029763779406874972
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8429752066115702,
"acc_stderr": 0.03321244842547128,
"acc_norm": 0.8429752066115702,
"acc_norm_stderr": 0.03321244842547128
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459753,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459753
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8135376756066411,
"acc_stderr": 0.013927751372001512,
"acc_norm": 0.8135376756066411,
"acc_norm_stderr": 0.013927751372001512
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044287,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5072625698324023,
"acc_stderr": 0.0167207374051795,
"acc_norm": 0.5072625698324023,
"acc_norm_stderr": 0.0167207374051795
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279046,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279046
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.025839898334877983,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.025839898334877983
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712992,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712992
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49934810951760106,
"acc_stderr": 0.012770225252255534,
"acc_norm": 0.49934810951760106,
"acc_norm_stderr": 0.012770225252255534
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.028814722422254184,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.028814722422254184
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.01882421951270621,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.01882421951270621
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7918367346938775,
"acc_stderr": 0.025991117672813296,
"acc_norm": 0.7918367346938775,
"acc_norm_stderr": 0.025991117672813296
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578334,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.038913644958358175,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.038913644958358175
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.03158149539338734,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.03158149539338734
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3818849449204406,
"mc1_stderr": 0.01700810193916349,
"mc2": 0.5711470281396481,
"mc2_stderr": 0.015283087726691595
}
}
```
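The aggregated `"all"` block above is an unweighted mean over the per-task scores. As a minimal sketch (using only three of the entries copied from the JSON above — the real results file carries all 61 tasks), the aggregation can be reproduced in plain Python:

```python
# Per-task accuracies copied from a few entries of the results JSON above.
# In practice you would load the full results parquet/JSON and collect
# every "acc" field; this subset is just for illustration.
per_task_acc = {
    "harness|arc:challenge|25": 0.6075085324232082,
    "harness|hellaswag|10": 0.6475801633140809,
    "harness|hendrycksTest-abstract_algebra|5": 0.29,
}

# Unweighted mean across tasks, matching how the "all" block is built.
mean_acc = sum(per_task_acc.values()) / len(per_task_acc)
print(round(mean_acc, 4))
```

Note that the mean over this three-task subset will not match the `"acc"` value in the `"all"` block, which averages over every task in the run.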
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
linhqyy/result_with_finetuned_taggenv2_40epoch_unfreeze | 2023-08-31T06:49:23.000Z | [
"region:us"
] | linhqyy | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: id
dtype: string
- name: w2v2_baseline_transcription
dtype: string
- name: w2v2_baseline_norm
dtype: string
splits:
- name: train
num_bytes: 174371470.027
num_examples: 1299
download_size: 164200894
dataset_size: 174371470.027
---
# Dataset Card for "result_with_finetuned_taggenv2_40epoch_unfreeze"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Ondiet/bert_model | 2023-08-31T06:57:29.000Z | [
"license:openrail",
"region:us"
] | Ondiet | null | null | null | 0 | 0 | ---
license: openrail
---
|
junlee666/test3 | 2023-08-31T06:58:43.000Z | [
"region:us"
] | junlee666 | null | null | null | 0 | 0 | Entry not found |