id stringlengths 2 115 | lastModified stringlengths 24 24 | tags list | author stringlengths 2 42 ⌀ | description stringlengths 0 68.7k ⌀ | citation stringlengths 0 10.7k ⌀ | cardData null | likes int64 0 3.55k | downloads int64 0 10.1M | card stringlengths 0 1.01M |
|---|---|---|---|---|---|---|---|---|---|
open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-13b-v4.5 | 2023-08-29T22:38:42.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of The-Face-Of-Goonery/Huginn-13b-v4.5
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [The-Face-Of-Goonery/Huginn-13b-v4.5](https://huggingface.co/The-Face-Of-Goonery/Huginn-13b-v4.5)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-13b-v4.5\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T22:38:13.203801](https://huggingface.co/datasets/open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-13b-v4.5/blob/main/results_2023-08-29T22%3A38%3A13.203801.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5259069198696034,\n\
\ \"acc_stderr\": 0.034732780984597124,\n \"acc_norm\": 0.5297450802692631,\n\
\ \"acc_norm_stderr\": 0.03471265443522423,\n \"mc1\": 0.34394124847001223,\n\
\ \"mc1_stderr\": 0.01662908751427678,\n \"mc2\": 0.506195531543039,\n\
\ \"mc2_stderr\": 0.01543396728769934\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5750853242320819,\n \"acc_stderr\": 0.014445698968520769,\n\
\ \"acc_norm\": 0.606655290102389,\n \"acc_norm_stderr\": 0.014275101465693026\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6285600477992431,\n\
\ \"acc_stderr\": 0.004822022254886021,\n \"acc_norm\": 0.8234415455088627,\n\
\ \"acc_norm_stderr\": 0.0038051533447130874\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5328947368421053,\n \"acc_stderr\": 0.040601270352363966,\n\
\ \"acc_norm\": 0.5328947368421053,\n \"acc_norm_stderr\": 0.040601270352363966\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n\
\ \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5773584905660377,\n \"acc_stderr\": 0.03040233144576954,\n\
\ \"acc_norm\": 0.5773584905660377,\n \"acc_norm_stderr\": 0.03040233144576954\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5208333333333334,\n\
\ \"acc_stderr\": 0.04177578950739993,\n \"acc_norm\": 0.5208333333333334,\n\
\ \"acc_norm_stderr\": 0.04177578950739993\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.48554913294797686,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.48554913294797686,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006715,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006715\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30952380952380953,\n \"acc_stderr\": 0.023809523809523864,\n \"\
acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.023809523809523864\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6419354838709678,\n\
\ \"acc_stderr\": 0.027273890594300645,\n \"acc_norm\": 0.6419354838709678,\n\
\ \"acc_norm_stderr\": 0.027273890594300645\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.37438423645320196,\n \"acc_stderr\": 0.03405155380561952,\n\
\ \"acc_norm\": 0.37438423645320196,\n \"acc_norm_stderr\": 0.03405155380561952\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.03697442205031596,\n\
\ \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.03697442205031596\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6767676767676768,\n \"acc_stderr\": 0.03332299921070644,\n \"\
acc_norm\": 0.6767676767676768,\n \"acc_norm_stderr\": 0.03332299921070644\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7564766839378239,\n \"acc_stderr\": 0.030975436386845436,\n\
\ \"acc_norm\": 0.7564766839378239,\n \"acc_norm_stderr\": 0.030975436386845436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5051282051282051,\n \"acc_stderr\": 0.025349672906838653,\n\
\ \"acc_norm\": 0.5051282051282051,\n \"acc_norm_stderr\": 0.025349672906838653\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.0279404571362284,\n \"acc_norm\":\
\ 0.3,\n \"acc_norm_stderr\": 0.0279404571362284\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.5210084033613446,\n \"acc_stderr\": 0.03244980849990029,\n\
\ \"acc_norm\": 0.5210084033613446,\n \"acc_norm_stderr\": 0.03244980849990029\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.710091743119266,\n\
\ \"acc_stderr\": 0.019453066609201597,\n \"acc_norm\": 0.710091743119266,\n\
\ \"acc_norm_stderr\": 0.019453066609201597\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643525,\n\
\ \"acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643525\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.696078431372549,\n \"acc_stderr\": 0.032282103870378914,\n \"\
acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.032282103870378914\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6877637130801688,\n \"acc_stderr\": 0.03016513786784701,\n \
\ \"acc_norm\": 0.6877637130801688,\n \"acc_norm_stderr\": 0.03016513786784701\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.03210062154134986,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.03210062154134986\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.042607351576445594,\n\
\ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.042607351576445594\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6859504132231405,\n \"acc_stderr\": 0.04236964753041019,\n \"\
acc_norm\": 0.6859504132231405,\n \"acc_norm_stderr\": 0.04236964753041019\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6574074074074074,\n\
\ \"acc_stderr\": 0.045879047413018105,\n \"acc_norm\": 0.6574074074074074,\n\
\ \"acc_norm_stderr\": 0.045879047413018105\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5950920245398773,\n \"acc_stderr\": 0.03856672163548913,\n\
\ \"acc_norm\": 0.5950920245398773,\n \"acc_norm_stderr\": 0.03856672163548913\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.042466243366976256,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.042466243366976256\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7948717948717948,\n\
\ \"acc_stderr\": 0.02645350805404033,\n \"acc_norm\": 0.7948717948717948,\n\
\ \"acc_norm_stderr\": 0.02645350805404033\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7075351213282248,\n\
\ \"acc_stderr\": 0.016267000684598642,\n \"acc_norm\": 0.7075351213282248,\n\
\ \"acc_norm_stderr\": 0.016267000684598642\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.02642481659400985,\n\
\ \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.02642481659400985\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574877,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574877\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5751633986928104,\n \"acc_stderr\": 0.028304576673141103,\n\
\ \"acc_norm\": 0.5751633986928104,\n \"acc_norm_stderr\": 0.028304576673141103\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5852090032154341,\n\
\ \"acc_stderr\": 0.027982680459759567,\n \"acc_norm\": 0.5852090032154341,\n\
\ \"acc_norm_stderr\": 0.027982680459759567\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.027648477877413327,\n\
\ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.027648477877413327\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.39361702127659576,\n \"acc_stderr\": 0.029144544781596147,\n \
\ \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.029144544781596147\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3970013037809648,\n\
\ \"acc_stderr\": 0.012496346982909554,\n \"acc_norm\": 0.3970013037809648,\n\
\ \"acc_norm_stderr\": 0.012496346982909554\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5073529411764706,\n \"acc_stderr\": 0.030369552523902173,\n\
\ \"acc_norm\": 0.5073529411764706,\n \"acc_norm_stderr\": 0.030369552523902173\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.47875816993464054,\n \"acc_stderr\": 0.020209572388600265,\n \
\ \"acc_norm\": 0.47875816993464054,\n \"acc_norm_stderr\": 0.020209572388600265\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.03100120903989484,\n\
\ \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.03100120903989484\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n\
\ \"acc_stderr\": 0.031871875379197966,\n \"acc_norm\": 0.7164179104477612,\n\
\ \"acc_norm_stderr\": 0.031871875379197966\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.038284011150790206,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.038284011150790206\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7134502923976608,\n \"acc_stderr\": 0.03467826685703826,\n\
\ \"acc_norm\": 0.7134502923976608,\n \"acc_norm_stderr\": 0.03467826685703826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34394124847001223,\n\
\ \"mc1_stderr\": 0.01662908751427678,\n \"mc2\": 0.506195531543039,\n\
\ \"mc2_stderr\": 0.01543396728769934\n }\n}\n```"
repo_url: https://huggingface.co/The-Face-Of-Goonery/Huginn-13b-v4.5
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|arc:challenge|25_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|arc:challenge|25_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hellaswag|10_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hellaswag|10_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T22:28:50.240458.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T22:38:13.203801.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T22:38:13.203801.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T22:28:50.240458.parquet'
- split: 2023_08_29T22_38_13.203801
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T22:38:13.203801.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T22:38:13.203801.parquet'
- config_name: results
data_files:
- split: 2023_08_29T22_28_50.240458
path:
- results_2023-08-29T22:28:50.240458.parquet
- split: 2023_08_29T22_38_13.203801
path:
- results_2023-08-29T22:38:13.203801.parquet
- split: latest
path:
- results_2023-08-29T22:38:13.203801.parquet
---
# Dataset Card for Evaluation run of The-Face-Of-Goonery/Huginn-13b-v4.5
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/The-Face-Of-Goonery/Huginn-13b-v4.5
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [The-Face-Of-Goonery/Huginn-13b-v4.5](https://huggingface.co/The-Face-Of-Goonery/Huginn-13b-v4.5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-13b-v4.5",
                    "harness_truthfulqa_mc_0",
                    split="latest")
```
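As the configuration list above shows, the config names follow a regular pattern: `harness_`, then the harness task name with its punctuation (`|`, `:`, `-`) replaced by underscores, then the number of few-shot examples. A small helper (hypothetical, not part of any official API) can build these names so you don't have to transcribe them by hand:

```python
def details_config_name(task: str, num_fewshot: int) -> str:
    """Build a details config name such as 'harness_hendrycksTest_anatomy_5'.

    Punctuation in the harness task name (':' and '-') is replaced with
    underscores, matching the config names listed in this card's metadata.
    """
    sanitized = task.replace(":", "_").replace("-", "_")
    return f"harness_{sanitized}_{num_fewshot}"

# e.g. details_config_name("truthfulqa:mc", 0) -> "harness_truthfulqa_mc_0"
# e.g. details_config_name("hendrycksTest-anatomy", 5) -> "harness_hendrycksTest_anatomy_5"
```

The resulting string can be passed as the second argument to `load_dataset` in the snippet above.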
## Latest results
These are the [latest results from run 2023-08-29T22:38:13.203801](https://huggingface.co/datasets/open-llm-leaderboard/details_The-Face-Of-Goonery__Huginn-13b-v4.5/blob/main/results_2023-08-29T22%3A38%3A13.203801.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5259069198696034,
"acc_stderr": 0.034732780984597124,
"acc_norm": 0.5297450802692631,
"acc_norm_stderr": 0.03471265443522423,
"mc1": 0.34394124847001223,
"mc1_stderr": 0.01662908751427678,
"mc2": 0.506195531543039,
"mc2_stderr": 0.01543396728769934
},
"harness|arc:challenge|25": {
"acc": 0.5750853242320819,
"acc_stderr": 0.014445698968520769,
"acc_norm": 0.606655290102389,
"acc_norm_stderr": 0.014275101465693026
},
"harness|hellaswag|10": {
"acc": 0.6285600477992431,
"acc_stderr": 0.004822022254886021,
"acc_norm": 0.8234415455088627,
"acc_norm_stderr": 0.0038051533447130874
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5328947368421053,
"acc_stderr": 0.040601270352363966,
"acc_norm": 0.5328947368421053,
"acc_norm_stderr": 0.040601270352363966
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5773584905660377,
"acc_stderr": 0.03040233144576954,
"acc_norm": 0.5773584905660377,
"acc_norm_stderr": 0.03040233144576954
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5208333333333334,
"acc_stderr": 0.04177578950739993,
"acc_norm": 0.5208333333333334,
"acc_norm_stderr": 0.04177578950739993
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.48554913294797686,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.48554913294797686,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006715,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006715
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.43829787234042555,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.43829787234042555,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.023809523809523864,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.023809523809523864
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6419354838709678,
"acc_stderr": 0.027273890594300645,
"acc_norm": 0.6419354838709678,
"acc_norm_stderr": 0.027273890594300645
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.37438423645320196,
"acc_stderr": 0.03405155380561952,
"acc_norm": 0.37438423645320196,
"acc_norm_stderr": 0.03405155380561952
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6606060606060606,
"acc_stderr": 0.03697442205031596,
"acc_norm": 0.6606060606060606,
"acc_norm_stderr": 0.03697442205031596
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6767676767676768,
"acc_stderr": 0.03332299921070644,
"acc_norm": 0.6767676767676768,
"acc_norm_stderr": 0.03332299921070644
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7564766839378239,
"acc_stderr": 0.030975436386845436,
"acc_norm": 0.7564766839378239,
"acc_norm_stderr": 0.030975436386845436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5051282051282051,
"acc_stderr": 0.025349672906838653,
"acc_norm": 0.5051282051282051,
"acc_norm_stderr": 0.025349672906838653
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.0279404571362284,
"acc_norm": 0.3,
"acc_norm_stderr": 0.0279404571362284
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5210084033613446,
"acc_stderr": 0.03244980849990029,
"acc_norm": 0.5210084033613446,
"acc_norm_stderr": 0.03244980849990029
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.710091743119266,
"acc_stderr": 0.019453066609201597,
"acc_norm": 0.710091743119266,
"acc_norm_stderr": 0.019453066609201597
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643525,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643525
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.032282103870378914,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.032282103870378914
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6877637130801688,
"acc_stderr": 0.03016513786784701,
"acc_norm": 0.6877637130801688,
"acc_norm_stderr": 0.03016513786784701
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.03210062154134986,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.03210062154134986
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.042607351576445594,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.042607351576445594
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6859504132231405,
"acc_stderr": 0.04236964753041019,
"acc_norm": 0.6859504132231405,
"acc_norm_stderr": 0.04236964753041019
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.045879047413018105,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.045879047413018105
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5950920245398773,
"acc_stderr": 0.03856672163548913,
"acc_norm": 0.5950920245398773,
"acc_norm_stderr": 0.03856672163548913
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976256,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.042466243366976256
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7948717948717948,
"acc_stderr": 0.02645350805404033,
"acc_norm": 0.7948717948717948,
"acc_norm_stderr": 0.02645350805404033
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7075351213282248,
"acc_stderr": 0.016267000684598642,
"acc_norm": 0.7075351213282248,
"acc_norm_stderr": 0.016267000684598642
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.02642481659400985,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.02642481659400985
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574877,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574877
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5751633986928104,
"acc_stderr": 0.028304576673141103,
"acc_norm": 0.5751633986928104,
"acc_norm_stderr": 0.028304576673141103
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5852090032154341,
"acc_stderr": 0.027982680459759567,
"acc_norm": 0.5852090032154341,
"acc_norm_stderr": 0.027982680459759567
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.027648477877413327,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.027648477877413327
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.39361702127659576,
"acc_stderr": 0.029144544781596147,
"acc_norm": 0.39361702127659576,
"acc_norm_stderr": 0.029144544781596147
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3970013037809648,
"acc_stderr": 0.012496346982909554,
"acc_norm": 0.3970013037809648,
"acc_norm_stderr": 0.012496346982909554
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5073529411764706,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.5073529411764706,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.47875816993464054,
"acc_stderr": 0.020209572388600265,
"acc_norm": 0.47875816993464054,
"acc_norm_stderr": 0.020209572388600265
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6244897959183674,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.6244897959183674,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7164179104477612,
"acc_stderr": 0.031871875379197966,
"acc_norm": 0.7164179104477612,
"acc_norm_stderr": 0.031871875379197966
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.038284011150790206,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.038284011150790206
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7134502923976608,
"acc_stderr": 0.03467826685703826,
"acc_norm": 0.7134502923976608,
"acc_norm_stderr": 0.03467826685703826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34394124847001223,
"mc1_stderr": 0.01662908751427678,
"mc2": 0.506195531543039,
"mc2_stderr": 0.01543396728769934
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Conan-Lao/github-issues | 2023-08-29T22:38:12.000Z | [
"region:us"
] | Conan-Lao | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: labels
list:
- name: id
dtype: int64
- name: node_id
dtype: string
- name: url
dtype: string
- name: name
dtype: string
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: assignees
list:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: milestone
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: labels_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: description
dtype: string
- name: creator
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: open_issues
dtype: int64
- name: closed_issues
dtype: int64
- name: state
dtype: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: due_on
dtype: 'null'
- name: closed_at
dtype: 'null'
- name: comments
sequence: string
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: closed_at
dtype: timestamp[s]
- name: author_association
dtype: string
- name: active_lock_reason
dtype: 'null'
- name: body
dtype: string
- name: reactions
struct:
- name: url
dtype: string
- name: total_count
dtype: int64
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: laugh
dtype: int64
- name: hooray
dtype: int64
- name: confused
dtype: int64
- name: heart
dtype: int64
- name: rocket
dtype: int64
- name: eyes
dtype: int64
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: 'null'
- name: state_reason
dtype: string
- name: draft
dtype: bool
- name: pull_request
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: diff_url
dtype: string
- name: patch_url
dtype: string
- name: merged_at
dtype: timestamp[s]
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 17609883
num_examples: 2500
download_size: 4977402
dataset_size: 17609883
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "github-issues"
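The schema above ends with an `is_pull_request` flag: GitHub's issues endpoint returns pull requests alongside true issues, so consumers of this dataset typically filter on that field. A minimal sketch in plain Python, using hypothetical sample rows that mirror the schema (the real rows carry the full issue payload):

```python
# Hypothetical sample rows mirroring the dataset's schema; field values
# are illustrative, not taken from the actual data.
rows = [
    {"title": "Fix tokenizer bug", "is_pull_request": True},
    {"title": "Add streaming support", "is_pull_request": False},
    {"title": "Docs typo", "is_pull_request": False},
]

# Keep only true issues, dropping pull requests.
issues_only = [row for row in rows if not row["is_pull_request"]]
print(len(issues_only))  # 2
```

With `datasets` installed, the same filter can be applied to the full dataset via `load_dataset("Conan-Lao/github-issues", split="train").filter(lambda row: not row["is_pull_request"])`.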
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_xzuyn__LLaMa-2-PeanutButter_v4-7B-QLoRA | 2023-08-29T22:57:33.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of xzuyn/LLaMa-2-PeanutButter_v4-7B-QLoRA
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [xzuyn/LLaMa-2-PeanutButter_v4-7B-QLoRA](https://huggingface.co/xzuyn/LLaMa-2-PeanutButter_v4-7B-QLoRA)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xzuyn__LLaMa-2-PeanutButter_v4-7B-QLoRA\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T22:56:12.065154](https://huggingface.co/datasets/open-llm-leaderboard/details_xzuyn__LLaMa-2-PeanutButter_v4-7B-QLoRA/blob/main/results_2023-08-29T22%3A56%3A12.065154.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.438823528740988,\n\
\ \"acc_stderr\": 0.035260068155448576,\n \"acc_norm\": 0.44253606128507456,\n\
\ \"acc_norm_stderr\": 0.035246174415990414,\n \"mc1\": 0.2631578947368421,\n\
\ \"mc1_stderr\": 0.015415241740237017,\n \"mc2\": 0.4191863436208715,\n\
\ \"mc2_stderr\": 0.015793546690441883\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4880546075085324,\n \"acc_stderr\": 0.014607220340597171,\n\
\ \"acc_norm\": 0.5204778156996587,\n \"acc_norm_stderr\": 0.01459913135303501\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6022704640509858,\n\
\ \"acc_stderr\": 0.004884287515461491,\n \"acc_norm\": 0.788886675960964,\n\
\ \"acc_norm_stderr\": 0.004072645874992222\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04063302731486671,\n\
\ \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04063302731486671\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.41,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.43018867924528303,\n \"acc_stderr\": 0.030471445867183235,\n\
\ \"acc_norm\": 0.43018867924528303,\n \"acc_norm_stderr\": 0.030471445867183235\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.41040462427745666,\n\
\ \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.41040462427745666,\n\
\ \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179964,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179964\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3872340425531915,\n \"acc_stderr\": 0.03184389265339525,\n\
\ \"acc_norm\": 0.3872340425531915,\n \"acc_norm_stderr\": 0.03184389265339525\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.43448275862068964,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.43448275862068964,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30687830687830686,\n \"acc_stderr\": 0.023752928712112147,\n \"\
acc_norm\": 0.30687830687830686,\n \"acc_norm_stderr\": 0.023752928712112147\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n\
\ \"acc_stderr\": 0.03567016675276864,\n \"acc_norm\": 0.1984126984126984,\n\
\ \"acc_norm_stderr\": 0.03567016675276864\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.47096774193548385,\n \"acc_stderr\": 0.028396016402761005,\n \"\
acc_norm\": 0.47096774193548385,\n \"acc_norm_stderr\": 0.028396016402761005\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3448275862068966,\n \"acc_stderr\": 0.033442837442804574,\n \"\
acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.033442837442804574\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145634,\n \"acc_norm\"\
: 0.38,\n \"acc_norm_stderr\": 0.04878317312145634\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5333333333333333,\n \"acc_stderr\": 0.03895658065271846,\n\
\ \"acc_norm\": 0.5333333333333333,\n \"acc_norm_stderr\": 0.03895658065271846\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5404040404040404,\n \"acc_stderr\": 0.035507024651313425,\n \"\
acc_norm\": 0.5404040404040404,\n \"acc_norm_stderr\": 0.035507024651313425\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6632124352331606,\n \"acc_stderr\": 0.03410780251836184,\n\
\ \"acc_norm\": 0.6632124352331606,\n \"acc_norm_stderr\": 0.03410780251836184\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4025641025641026,\n \"acc_stderr\": 0.024864995159767755,\n\
\ \"acc_norm\": 0.4025641025641026,\n \"acc_norm_stderr\": 0.024864995159767755\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02730914058823019,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02730914058823019\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.36554621848739494,\n \"acc_stderr\": 0.0312821770636846,\n \
\ \"acc_norm\": 0.36554621848739494,\n \"acc_norm_stderr\": 0.0312821770636846\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6073394495412844,\n \"acc_stderr\": 0.020937505161201096,\n \"\
acc_norm\": 0.6073394495412844,\n \"acc_norm_stderr\": 0.020937505161201096\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.26851851851851855,\n \"acc_stderr\": 0.030225226160012404,\n \"\
acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.030225226160012404\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5098039215686274,\n \"acc_stderr\": 0.03508637358630572,\n \"\
acc_norm\": 0.5098039215686274,\n \"acc_norm_stderr\": 0.03508637358630572\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5021097046413502,\n \"acc_stderr\": 0.032546938018020076,\n \
\ \"acc_norm\": 0.5021097046413502,\n \"acc_norm_stderr\": 0.032546938018020076\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4798206278026906,\n\
\ \"acc_stderr\": 0.033530461674123,\n \"acc_norm\": 0.4798206278026906,\n\
\ \"acc_norm_stderr\": 0.033530461674123\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4351145038167939,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.4351145038167939,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5785123966942148,\n \"acc_stderr\": 0.045077322787750874,\n \"\
acc_norm\": 0.5785123966942148,\n \"acc_norm_stderr\": 0.045077322787750874\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.49074074074074076,\n\
\ \"acc_stderr\": 0.04832853553437055,\n \"acc_norm\": 0.49074074074074076,\n\
\ \"acc_norm_stderr\": 0.04832853553437055\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4049079754601227,\n \"acc_stderr\": 0.03856672163548913,\n\
\ \"acc_norm\": 0.4049079754601227,\n \"acc_norm_stderr\": 0.03856672163548913\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.042466243366976235,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.042466243366976235\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5145631067961165,\n \"acc_stderr\": 0.049486373240266356,\n\
\ \"acc_norm\": 0.5145631067961165,\n \"acc_norm_stderr\": 0.049486373240266356\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6367521367521367,\n\
\ \"acc_stderr\": 0.03150712523091264,\n \"acc_norm\": 0.6367521367521367,\n\
\ \"acc_norm_stderr\": 0.03150712523091264\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5696040868454662,\n\
\ \"acc_stderr\": 0.017705868776292398,\n \"acc_norm\": 0.5696040868454662,\n\
\ \"acc_norm_stderr\": 0.017705868776292398\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4682080924855491,\n \"acc_stderr\": 0.026864624366756646,\n\
\ \"acc_norm\": 0.4682080924855491,\n \"acc_norm_stderr\": 0.026864624366756646\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2659217877094972,\n\
\ \"acc_stderr\": 0.014776765066438883,\n \"acc_norm\": 0.2659217877094972,\n\
\ \"acc_norm_stderr\": 0.014776765066438883\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.49673202614379086,\n \"acc_stderr\": 0.028629305194003543,\n\
\ \"acc_norm\": 0.49673202614379086,\n \"acc_norm_stderr\": 0.028629305194003543\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5787781350482315,\n\
\ \"acc_stderr\": 0.02804339985821063,\n \"acc_norm\": 0.5787781350482315,\n\
\ \"acc_norm_stderr\": 0.02804339985821063\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.027801656212323667,\n\
\ \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.027801656212323667\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3404255319148936,\n \"acc_stderr\": 0.02826765748265014,\n \
\ \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.02826765748265014\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.32073011734028684,\n\
\ \"acc_stderr\": 0.011921199991782643,\n \"acc_norm\": 0.32073011734028684,\n\
\ \"acc_norm_stderr\": 0.011921199991782643\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.47794117647058826,\n \"acc_stderr\": 0.030343264224213528,\n\
\ \"acc_norm\": 0.47794117647058826,\n \"acc_norm_stderr\": 0.030343264224213528\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.39869281045751637,\n \"acc_stderr\": 0.019808281317449848,\n \
\ \"acc_norm\": 0.39869281045751637,\n \"acc_norm_stderr\": 0.019808281317449848\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4818181818181818,\n\
\ \"acc_stderr\": 0.04785964010794917,\n \"acc_norm\": 0.4818181818181818,\n\
\ \"acc_norm_stderr\": 0.04785964010794917\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.40408163265306124,\n \"acc_stderr\": 0.0314147080258659,\n\
\ \"acc_norm\": 0.40408163265306124,\n \"acc_norm_stderr\": 0.0314147080258659\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5671641791044776,\n\
\ \"acc_stderr\": 0.03503490923673282,\n \"acc_norm\": 0.5671641791044776,\n\
\ \"acc_norm_stderr\": 0.03503490923673282\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3614457831325301,\n\
\ \"acc_stderr\": 0.03740059382029321,\n \"acc_norm\": 0.3614457831325301,\n\
\ \"acc_norm_stderr\": 0.03740059382029321\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6432748538011696,\n \"acc_stderr\": 0.03674013002860954,\n\
\ \"acc_norm\": 0.6432748538011696,\n \"acc_norm_stderr\": 0.03674013002860954\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2631578947368421,\n\
\ \"mc1_stderr\": 0.015415241740237017,\n \"mc2\": 0.4191863436208715,\n\
\ \"mc2_stderr\": 0.015793546690441883\n }\n}\n```"
repo_url: https://huggingface.co/xzuyn/LLaMa-2-PeanutButter_v4-7B-QLoRA
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|arc:challenge|25_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hellaswag|10_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T22:56:12.065154.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T22:56:12.065154.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T22:56:12.065154.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T22:56:12.065154.parquet'
- config_name: results
data_files:
- split: 2023_08_29T22_56_12.065154
path:
- results_2023-08-29T22:56:12.065154.parquet
- split: latest
path:
- results_2023-08-29T22:56:12.065154.parquet
---
# Dataset Card for Evaluation run of xzuyn/LLaMa-2-PeanutButter_v4-7B-QLoRA
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/xzuyn/LLaMa-2-PeanutButter_v4-7B-QLoRA
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [xzuyn/LLaMa-2-PeanutButter_v4-7B-QLoRA](https://huggingface.co/xzuyn/LLaMa-2-PeanutButter_v4-7B-QLoRA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xzuyn__LLaMa-2-PeanutButter_v4-7B-QLoRA",
"harness_truthfulqa_mc_0",
split="train")
```
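The config names above follow a simple pattern derived from the harness task id: for example, `harness|hendrycksTest-abstract_algebra|5` becomes the config `harness_hendrycksTest_abstract_algebra_5`. A small illustrative helper (not part of any library, shown here only to document the mapping) can build a config name for any task and few-shot count:

```python
def config_name(task: str, num_fewshot: int) -> str:
    """Map a harness task id to this dataset's config name.

    Pipes in "harness|<task>|<shots>" become underscores, as do the
    hyphens and colons inside the task name (e.g. "truthfulqa:mc").
    """
    normalized = task.replace("-", "_").replace(":", "_")
    return "_".join(["harness", normalized, str(num_fewshot)])


print(config_name("hendrycksTest-abstract_algebra", 5))
# harness_hendrycksTest_abstract_algebra_5
print(config_name("truthfulqa:mc", 0))
# harness_truthfulqa_mc_0
```

The resulting string can be passed as the second argument to `load_dataset` in the snippet above to select that task's details.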
## Latest results
These are the [latest results from run 2023-08-29T22:56:12.065154](https://huggingface.co/datasets/open-llm-leaderboard/details_xzuyn__LLaMa-2-PeanutButter_v4-7B-QLoRA/blob/main/results_2023-08-29T22%3A56%3A12.065154.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.438823528740988,
"acc_stderr": 0.035260068155448576,
"acc_norm": 0.44253606128507456,
"acc_norm_stderr": 0.035246174415990414,
"mc1": 0.2631578947368421,
"mc1_stderr": 0.015415241740237017,
"mc2": 0.4191863436208715,
"mc2_stderr": 0.015793546690441883
},
"harness|arc:challenge|25": {
"acc": 0.4880546075085324,
"acc_stderr": 0.014607220340597171,
"acc_norm": 0.5204778156996587,
"acc_norm_stderr": 0.01459913135303501
},
"harness|hellaswag|10": {
"acc": 0.6022704640509858,
"acc_stderr": 0.004884287515461491,
"acc_norm": 0.788886675960964,
"acc_norm_stderr": 0.004072645874992222
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.43018867924528303,
"acc_stderr": 0.030471445867183235,
"acc_norm": 0.43018867924528303,
"acc_norm_stderr": 0.030471445867183235
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4375,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.41040462427745666,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.41040462427745666,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179964,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179964
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3872340425531915,
"acc_stderr": 0.03184389265339525,
"acc_norm": 0.3872340425531915,
"acc_norm_stderr": 0.03184389265339525
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.43448275862068964,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.43448275862068964,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30687830687830686,
"acc_stderr": 0.023752928712112147,
"acc_norm": 0.30687830687830686,
"acc_norm_stderr": 0.023752928712112147
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1984126984126984,
"acc_stderr": 0.03567016675276864,
"acc_norm": 0.1984126984126984,
"acc_norm_stderr": 0.03567016675276864
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.47096774193548385,
"acc_stderr": 0.028396016402761005,
"acc_norm": 0.47096774193548385,
"acc_norm_stderr": 0.028396016402761005
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.033442837442804574,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.033442837442804574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145634,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145634
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.03895658065271846,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.03895658065271846
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5404040404040404,
"acc_stderr": 0.035507024651313425,
"acc_norm": 0.5404040404040404,
"acc_norm_stderr": 0.035507024651313425
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6632124352331606,
"acc_stderr": 0.03410780251836184,
"acc_norm": 0.6632124352331606,
"acc_norm_stderr": 0.03410780251836184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4025641025641026,
"acc_stderr": 0.024864995159767755,
"acc_norm": 0.4025641025641026,
"acc_norm_stderr": 0.024864995159767755
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02730914058823019,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02730914058823019
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.36554621848739494,
"acc_stderr": 0.0312821770636846,
"acc_norm": 0.36554621848739494,
"acc_norm_stderr": 0.0312821770636846
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6073394495412844,
"acc_stderr": 0.020937505161201096,
"acc_norm": 0.6073394495412844,
"acc_norm_stderr": 0.020937505161201096
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.030225226160012404,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.030225226160012404
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5098039215686274,
"acc_stderr": 0.03508637358630572,
"acc_norm": 0.5098039215686274,
"acc_norm_stderr": 0.03508637358630572
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5021097046413502,
"acc_stderr": 0.032546938018020076,
"acc_norm": 0.5021097046413502,
"acc_norm_stderr": 0.032546938018020076
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4798206278026906,
"acc_stderr": 0.033530461674123,
"acc_norm": 0.4798206278026906,
"acc_norm_stderr": 0.033530461674123
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4351145038167939,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.4351145038167939,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5785123966942148,
"acc_stderr": 0.045077322787750874,
"acc_norm": 0.5785123966942148,
"acc_norm_stderr": 0.045077322787750874
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.04832853553437055,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.04832853553437055
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4049079754601227,
"acc_stderr": 0.03856672163548913,
"acc_norm": 0.4049079754601227,
"acc_norm_stderr": 0.03856672163548913
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976235,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.042466243366976235
},
"harness|hendrycksTest-management|5": {
"acc": 0.5145631067961165,
"acc_stderr": 0.049486373240266356,
"acc_norm": 0.5145631067961165,
"acc_norm_stderr": 0.049486373240266356
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6367521367521367,
"acc_stderr": 0.03150712523091264,
"acc_norm": 0.6367521367521367,
"acc_norm_stderr": 0.03150712523091264
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5696040868454662,
"acc_stderr": 0.017705868776292398,
"acc_norm": 0.5696040868454662,
"acc_norm_stderr": 0.017705868776292398
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4682080924855491,
"acc_stderr": 0.026864624366756646,
"acc_norm": 0.4682080924855491,
"acc_norm_stderr": 0.026864624366756646
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2659217877094972,
"acc_stderr": 0.014776765066438883,
"acc_norm": 0.2659217877094972,
"acc_norm_stderr": 0.014776765066438883
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.49673202614379086,
"acc_stderr": 0.028629305194003543,
"acc_norm": 0.49673202614379086,
"acc_norm_stderr": 0.028629305194003543
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5787781350482315,
"acc_stderr": 0.02804339985821063,
"acc_norm": 0.5787781350482315,
"acc_norm_stderr": 0.02804339985821063
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.027801656212323667,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.027801656212323667
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3404255319148936,
"acc_stderr": 0.02826765748265014,
"acc_norm": 0.3404255319148936,
"acc_norm_stderr": 0.02826765748265014
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.32073011734028684,
"acc_stderr": 0.011921199991782643,
"acc_norm": 0.32073011734028684,
"acc_norm_stderr": 0.011921199991782643
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.47794117647058826,
"acc_stderr": 0.030343264224213528,
"acc_norm": 0.47794117647058826,
"acc_norm_stderr": 0.030343264224213528
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.39869281045751637,
"acc_stderr": 0.019808281317449848,
"acc_norm": 0.39869281045751637,
"acc_norm_stderr": 0.019808281317449848
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4818181818181818,
"acc_stderr": 0.04785964010794917,
"acc_norm": 0.4818181818181818,
"acc_norm_stderr": 0.04785964010794917
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.40408163265306124,
"acc_stderr": 0.0314147080258659,
"acc_norm": 0.40408163265306124,
"acc_norm_stderr": 0.0314147080258659
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5671641791044776,
"acc_stderr": 0.03503490923673282,
"acc_norm": 0.5671641791044776,
"acc_norm_stderr": 0.03503490923673282
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3614457831325301,
"acc_stderr": 0.03740059382029321,
"acc_norm": 0.3614457831325301,
"acc_norm_stderr": 0.03740059382029321
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6432748538011696,
"acc_stderr": 0.03674013002860954,
"acc_norm": 0.6432748538011696,
"acc_norm_stderr": 0.03674013002860954
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2631578947368421,
"mc1_stderr": 0.015415241740237017,
"mc2": 0.4191863436208715,
"mc2_stderr": 0.015793546690441883
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
theblackcat102/crossvalidated-posts | 2023-08-30T00:04:10.000Z | [
"task_categories:question-answering",
"task_categories:text-generation",
"task_categories:text2text-generation",
"language:code",
"language:en",
"code",
"region:us"
] | theblackcat102 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: Id
dtype: string
- name: PostTypeId
dtype: string
- name: AcceptedAnswerId
dtype: string
- name: ParentId
dtype: string
- name: Score
dtype: string
- name: ViewCount
dtype: string
- name: Body
dtype: string
- name: Title
dtype: string
- name: ContentLicense
dtype: string
- name: FavoriteCount
dtype: string
- name: CreationDate
dtype: string
- name: LastActivityDate
dtype: string
- name: LastEditDate
dtype: string
- name: LastEditorUserId
dtype: string
- name: OwnerUserId
dtype: string
- name: Tags
sequence: string
splits:
- name: train
num_bytes: 566804417
num_examples: 411232
download_size: 311064786
dataset_size: 566804417
language:
- code
- en
task_categories:
- question-answering
- text-generation
- text2text-generation
tags:
- code
---
# Cross Validated / stats.stackexchange.com
## Dataset Summary
This dataset contains all posts submitted to stats.stackexchange.com before the 30th of August 2023, formatted as **Markdown text**.<br>
The data is sourced from the [Internet Archive StackExchange Data Dump](https://archive.org/download/stackexchange) and follows the format of [mikex86/stackoverflow-posts](https://huggingface.co/datasets/mikex86/stackoverflow-posts).
## Dataset Structure
Each record corresponds to one post of a particular type.
The original ordering from the data dump is not exactly preserved, due to parallelism in the script used to process it.
The markdown content of each post is contained in the `Body` field. The license for a particular post is contained in the `ContentLicense` field.
### Data Fields
```typescript
{
Id: long,
PostTypeId: long, // 1=Question, 2=Answer, 3=Orphaned tag wiki, 4=Tag wiki excerpt, 5=Tag wiki, 6=Moderator nomination, 7=Wiki Placeholder, 8=Privilege Wiki
AcceptedAnswerId: long | null, // only present if PostTypeId=1
ParentId: long | null, // only present if PostTypeId=2
Score: long,
ViewCount: long | null,
Body: string | null,
Title: string | null,
ContentLicense: string | null,
FavoriteCount: long | null,
CreationDate: string | null,
LastActivityDate: string | null,
LastEditDate: string | null,
LastEditorUserId: long | null,
OwnerUserId: long | null,
Tags: array<string> | null
}
```
Also consider the [StackExchange Datadump Schema Documentation](https://meta.stackexchange.com/questions/2677/database-schema-documentation-for-the-public-data-dump-and-sede), as all fields
have analogs in the original dump format.
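As a small illustration of how these fields relate, the snippet below joins questions to their accepted answers on a few toy records shaped like the schema above. The records are invented for illustration, and note that the published Parquet features store the id fields as strings, so compare against string values (e.g. `"1"`) on the real data.

```python
# Toy records shaped like the schema above (values are illustrative, not real posts).
posts = [
    {"Id": 1, "PostTypeId": 1, "AcceptedAnswerId": 3, "Title": "What is an ARIMA model?", "Body": "..."},
    {"Id": 2, "PostTypeId": 2, "ParentId": 1, "Body": "A partial answer."},
    {"Id": 3, "PostTypeId": 2, "ParentId": 1, "Body": "ARIMA combines AR, I and MA terms."},
]

# Index answers (PostTypeId == 2) by their Id, then join each question
# (PostTypeId == 1) to its accepted answer via AcceptedAnswerId.
answers = {p["Id"]: p for p in posts if p["PostTypeId"] == 2}
qa_pairs = [
    (q["Title"], answers[q["AcceptedAnswerId"]]["Body"])
    for q in posts
    if q["PostTypeId"] == 1 and q.get("AcceptedAnswerId") in answers
]
print(qa_pairs)  # [('What is an ARIMA model?', 'ARIMA combines AR, I and MA terms.')]
```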
## How to use?
```python
from datasets import load_dataset
# pre-download the full dataset
ds = load_dataset('theblackcat102/crossvalidated-posts', split='train')
# dataset streaming (will only download the data as needed)
ds = load_dataset('theblackcat102/crossvalidated-posts', split='train', streaming=True)
for sample in ds:
    print(sample["Body"])
```
## How is the text stored?
The original Data Dump formats the "Body" field as HTML, using tags such as `<code>`, `<h1>`, `<ul>`, etc.
This HTML format has been converted to Markdown following the [mikex86/stackoverflow-posts](https://huggingface.co/datasets/mikex86/stackoverflow-posts) conversion rules.
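As an illustrative sketch only (not the actual conversion script used for this dataset), a minimal HTML-to-Markdown pass for a few of these tags might look like:

```python
import re

def html_to_markdown(html: str) -> str:
    """Rough sketch of the HTML -> Markdown direction (illustrative only)."""
    md = html
    md = re.sub(r"<h1>(.*?)</h1>", r"# \1", md, flags=re.S)    # headings
    md = re.sub(r"<code>(.*?)</code>", r"`\1`", md, flags=re.S)  # inline code
    md = re.sub(r"<li>(.*?)</li>", r"- \1", md, flags=re.S)    # list items
    md = re.sub(r"</?(ul|p)>", "", md)                          # drop wrappers
    return md.strip()

print(html_to_markdown("<p>Use <code>statsmodels</code></p>"))  # Use `statsmodels`
```

A production converter would use a real HTML parser rather than regular expressions; this sketch only shows the shape of the mapping.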
**Example:**
After differencing I saw that my constant/intercept is not statistically significant. Does anybody know how to fit the same model without the const term?
im using statsmodels.tsa.arima.model
To give a relative example I have: `ARIMA(data, order=(3,0,0))` an AR(3) model and say it that the second coefficient is insignificant. I can get rid of it by typing
```
ARMA(data,order=([1, 3], 0, 0)
```
but how can I get rid of coefficient??
|
yzhuang/autotree_automl_Diabetes130US_gosdt_l512_d3_sd1 | 2023-08-30T00:17:25.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: int64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: int64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 5538400000
num_examples: 100000
- name: validation
num_bytes: 553840000
num_examples: 10000
download_size: 487895484
dataset_size: 6092240000
---
# Dataset Card for "autotree_automl_Diabetes130US_gosdt_l512_d3_sd1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mango19918/aimodels | 2023-10-05T19:28:40.000Z | [
"license:openrail",
"region:us"
] | mango19918 | null | null | null | 0 | 0 | ---
license: openrail
---
|
griffin/baseline_summarization | 2023-08-30T00:23:56.000Z | [
"region:us"
] | griffin | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: prompt
dtype: string
- name: completion
dtype: string
splits:
- name: train
num_bytes: 4713630
num_examples: 1000
download_size: 2784742
dataset_size: 4713630
---
# Dataset Card for "baseline_summarization"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_TFLai__Luban-Platypus2-13B-QLora-0.80-epoch | 2023-08-30T01:03:53.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TFLai/Luban-Platypus2-13B-QLora-0.80-epoch
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TFLai/Luban-Platypus2-13B-QLora-0.80-epoch](https://huggingface.co/TFLai/Luban-Platypus2-13B-QLora-0.80-epoch)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TFLai__Luban-Platypus2-13B-QLora-0.80-epoch\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-30T01:02:30.667173](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__Luban-Platypus2-13B-QLora-0.80-epoch/blob/main/results_2023-08-30T01%3A02%3A30.667173.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5809459434164882,\n\
\ \"acc_stderr\": 0.0341259980988452,\n \"acc_norm\": 0.5848222325630145,\n\
\ \"acc_norm_stderr\": 0.034106187006088,\n \"mc1\": 0.38310893512851896,\n\
\ \"mc1_stderr\": 0.017018461679389855,\n \"mc2\": 0.5525986510332871,\n\
\ \"mc2_stderr\": 0.01577560515591045\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.575938566552901,\n \"acc_stderr\": 0.01444188962746439,\n\
\ \"acc_norm\": 0.6023890784982935,\n \"acc_norm_stderr\": 0.014301752223279535\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6199960167297351,\n\
\ \"acc_stderr\": 0.004843954338451441,\n \"acc_norm\": 0.8222465644293966,\n\
\ \"acc_norm_stderr\": 0.0038152372699611016\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n\
\ \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.03005258057955784,\n\
\ \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.03005258057955784\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\
\ \"acc_stderr\": 0.03798106566014499,\n \"acc_norm\": 0.5433526011560693,\n\
\ \"acc_norm_stderr\": 0.03798106566014499\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929776,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929776\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4851063829787234,\n \"acc_stderr\": 0.032671518489247764,\n\
\ \"acc_norm\": 0.4851063829787234,\n \"acc_norm_stderr\": 0.032671518489247764\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336937,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336937\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3492063492063492,\n \"acc_stderr\": 0.024552292209342668,\n \"\
acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.024552292209342668\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.041905964388711366,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.041905964388711366\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6516129032258065,\n\
\ \"acc_stderr\": 0.027104826328100944,\n \"acc_norm\": 0.6516129032258065,\n\
\ \"acc_norm_stderr\": 0.027104826328100944\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7323232323232324,\n \"acc_stderr\": 0.03154449888270285,\n \"\
acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.03154449888270285\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n\
\ \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6102564102564103,\n \"acc_stderr\": 0.024726967886647078,\n\
\ \"acc_norm\": 0.6102564102564103,\n \"acc_norm_stderr\": 0.024726967886647078\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5672268907563025,\n \"acc_stderr\": 0.032183581077426124,\n\
\ \"acc_norm\": 0.5672268907563025,\n \"acc_norm_stderr\": 0.032183581077426124\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658754,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658754\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7798165137614679,\n \"acc_stderr\": 0.01776597865232755,\n \"\
acc_norm\": 0.7798165137614679,\n \"acc_norm_stderr\": 0.01776597865232755\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.03350991604696042,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.03350991604696042\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159263,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159263\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302873,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302873\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935574,\n\
\ \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935574\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.04572372358737431,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.04572372358737431\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.023902325549560396,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.023902325549560396\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.776500638569604,\n\
\ \"acc_stderr\": 0.01489723522945071,\n \"acc_norm\": 0.776500638569604,\n\
\ \"acc_norm_stderr\": 0.01489723522945071\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.026226158605124658,\n\
\ \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.026226158605124658\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4335195530726257,\n\
\ \"acc_stderr\": 0.01657402721951763,\n \"acc_norm\": 0.4335195530726257,\n\
\ \"acc_norm_stderr\": 0.01657402721951763\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6568627450980392,\n \"acc_stderr\": 0.02718449890994161,\n\
\ \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.02718449890994161\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6463022508038585,\n\
\ \"acc_stderr\": 0.027155208103200868,\n \"acc_norm\": 0.6463022508038585,\n\
\ \"acc_norm_stderr\": 0.027155208103200868\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.691358024691358,\n \"acc_stderr\": 0.025702640260603753,\n\
\ \"acc_norm\": 0.691358024691358,\n \"acc_norm_stderr\": 0.025702640260603753\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.439374185136897,\n\
\ \"acc_stderr\": 0.012676014778580212,\n \"acc_norm\": 0.439374185136897,\n\
\ \"acc_norm_stderr\": 0.012676014778580212\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.030161911930767105,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.030161911930767105\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5931372549019608,\n \"acc_stderr\": 0.019873802005061177,\n \
\ \"acc_norm\": 0.5931372549019608,\n \"acc_norm_stderr\": 0.019873802005061177\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6653061224489796,\n \"acc_stderr\": 0.030209235226242307,\n\
\ \"acc_norm\": 0.6653061224489796,\n \"acc_norm_stderr\": 0.030209235226242307\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n\
\ \"acc_stderr\": 0.03096590312357302,\n \"acc_norm\": 0.7412935323383084,\n\
\ \"acc_norm_stderr\": 0.03096590312357302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.040201512610368466,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.040201512610368466\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.38310893512851896,\n\
\ \"mc1_stderr\": 0.017018461679389855,\n \"mc2\": 0.5525986510332871,\n\
\ \"mc2_stderr\": 0.01577560515591045\n }\n}\n```"
repo_url: https://huggingface.co/TFLai/Luban-Platypus2-13B-QLora-0.80-epoch
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|arc:challenge|25_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hellaswag|10_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T01:02:30.667173.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T01:02:30.667173.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T01:02:30.667173.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T01:02:30.667173.parquet'
- config_name: results
data_files:
- split: 2023_08_30T01_02_30.667173
path:
- results_2023-08-30T01:02:30.667173.parquet
- split: latest
path:
- results_2023-08-30T01:02:30.667173.parquet
---
# Dataset Card for Evaluation run of TFLai/Luban-Platypus2-13B-QLora-0.80-epoch
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TFLai/Luban-Platypus2-13B-QLora-0.80-epoch
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TFLai/Luban-Platypus2-13B-QLora-0.80-epoch](https://huggingface.co/TFLai/Luban-Platypus2-13B-QLora-0.80-epoch) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TFLai__Luban-Platypus2-13B-QLora-0.80-epoch",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-30T01:02:30.667173](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__Luban-Platypus2-13B-QLora-0.80-epoch/blob/main/results_2023-08-30T01%3A02%3A30.667173.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each task's results in its own config, under the "latest" split):
```python
{
"all": {
"acc": 0.5809459434164882,
"acc_stderr": 0.0341259980988452,
"acc_norm": 0.5848222325630145,
"acc_norm_stderr": 0.034106187006088,
"mc1": 0.38310893512851896,
"mc1_stderr": 0.017018461679389855,
"mc2": 0.5525986510332871,
"mc2_stderr": 0.01577560515591045
},
"harness|arc:challenge|25": {
"acc": 0.575938566552901,
"acc_stderr": 0.01444188962746439,
"acc_norm": 0.6023890784982935,
"acc_norm_stderr": 0.014301752223279535
},
"harness|hellaswag|10": {
"acc": 0.6199960167297351,
"acc_stderr": 0.004843954338451441,
"acc_norm": 0.8222465644293966,
"acc_norm_stderr": 0.0038152372699611016
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6075471698113207,
"acc_stderr": 0.03005258057955784,
"acc_norm": 0.6075471698113207,
"acc_norm_stderr": 0.03005258057955784
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014499,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014499
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929776,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929776
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4851063829787234,
"acc_stderr": 0.032671518489247764,
"acc_norm": 0.4851063829787234,
"acc_norm_stderr": 0.032671518489247764
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336937,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336937
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.024552292209342668,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.024552292209342668
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.041905964388711366,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.041905964388711366
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6516129032258065,
"acc_stderr": 0.027104826328100944,
"acc_norm": 0.6516129032258065,
"acc_norm_stderr": 0.027104826328100944
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7323232323232324,
"acc_stderr": 0.03154449888270285,
"acc_norm": 0.7323232323232324,
"acc_norm_stderr": 0.03154449888270285
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6102564102564103,
"acc_stderr": 0.024726967886647078,
"acc_norm": 0.6102564102564103,
"acc_norm_stderr": 0.024726967886647078
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5672268907563025,
"acc_stderr": 0.032183581077426124,
"acc_norm": 0.5672268907563025,
"acc_norm_stderr": 0.032183581077426124
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658754,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7798165137614679,
"acc_stderr": 0.01776597865232755,
"acc_norm": 0.7798165137614679,
"acc_norm_stderr": 0.01776597865232755
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.03350991604696042,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.03350991604696042
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159263,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159263
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302873,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302873
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935574,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935574
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.04572372358737431,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.04572372358737431
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.023902325549560396,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.023902325549560396
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.776500638569604,
"acc_stderr": 0.01489723522945071,
"acc_norm": 0.776500638569604,
"acc_norm_stderr": 0.01489723522945071
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.026226158605124658,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.026226158605124658
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4335195530726257,
"acc_stderr": 0.01657402721951763,
"acc_norm": 0.4335195530726257,
"acc_norm_stderr": 0.01657402721951763
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.02718449890994161,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.02718449890994161
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6463022508038585,
"acc_stderr": 0.027155208103200868,
"acc_norm": 0.6463022508038585,
"acc_norm_stderr": 0.027155208103200868
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.691358024691358,
"acc_stderr": 0.025702640260603753,
"acc_norm": 0.691358024691358,
"acc_norm_stderr": 0.025702640260603753
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.439374185136897,
"acc_stderr": 0.012676014778580212,
"acc_norm": 0.439374185136897,
"acc_norm_stderr": 0.012676014778580212
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.030161911930767105,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.030161911930767105
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5931372549019608,
"acc_stderr": 0.019873802005061177,
"acc_norm": 0.5931372549019608,
"acc_norm_stderr": 0.019873802005061177
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6653061224489796,
"acc_stderr": 0.030209235226242307,
"acc_norm": 0.6653061224489796,
"acc_norm_stderr": 0.030209235226242307
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.03096590312357302,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.03096590312357302
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.040201512610368466,
"acc_norm": 0.8,
"acc_norm_stderr": 0.040201512610368466
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.38310893512851896,
"mc1_stderr": 0.017018461679389855,
"mc2": 0.5525986510332871,
"mc2_stderr": 0.01577560515591045
}
}
```
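As a sketch, the per-task accuracies in a results blob like the one above can be aggregated with a few lines of Python. The dict below is a truncated, illustrative copy of three entries from the results, not the full set:

```python
# Average the "acc" field over the MMLU (hendrycksTest) tasks in a
# results dict shaped like the JSON above. Sample values copied from
# three of the entries; the real dict has one entry per task.
results = {
    "harness|hendrycksTest-econometrics|5": {"acc": 0.34210526315789475},
    "harness|hendrycksTest-electrical_engineering|5": {"acc": 0.5793103448275863},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8070175438596491},
}

accs = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest")]
mean_acc = sum(accs) / len(accs)
print(f"mean accuracy over {len(accs)} tasks: {mean_acc:.4f}")
```

The leaderboard's aggregated MMLU score is computed analogously over all 57 subject tasks.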
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
sjw/data.csv | 2023-08-31T23:02:11.000Z | [
"region:us"
] | sjw | null | null | null | 0 | 0 | |
syedhuq/newXYZ | 2023-08-30T13:50:52.000Z | [
"license:llama2",
"region:us"
] | syedhuq | null | null | null | 0 | 0 | ---
license: llama2
---
|
Laoganbaicai/ipynb | 2023-09-15T07:30:37.000Z | [
"license:openrail",
"region:us"
] | Laoganbaicai | null | null | null | 0 | 0 | ---
license: openrail
---
|
alkzar90/product-descriptions | 2023-08-30T01:24:31.000Z | [
"license:mit",
"region:us"
] | alkzar90 | null | null | null | 0 | 0 | ---
license: mit
---
|
Linhz/qg_viquad_2samples | 2023-08-30T01:46:47.000Z | [
"region:us"
] | Linhz | null | null | null | 0 | 0 | Entry not found |
Jackson428/emotion-c | 2023-08-30T01:49:26.000Z | [
"size_categories:n<1K",
"rlfh",
"argilla",
"human-feedback",
"region:us"
] | Jackson428 | null | null | null | 0 | 0 | ---
size_categories: n<1K
tags:
- rlfh
- argilla
- human-feedback
---
# Dataset Card for emotion-c
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla with `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("Jackson428/emotion-c")
```
### Load with `datasets`
To load this dataset with `datasets`, you'll just need to install `datasets` with `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("Jackson428/emotion-c")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/guides/llms/conceptual_guides/data_model.html) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, and **guidelines**.
The **fields** are the dataset records themselves; for the moment only text fields are supported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| text | Text | TextField | True | False |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, single choice, or multiple choice.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| sentiment | Sentiment | LabelQuestion | True | N/A | ['positive', 'neutral', 'negative'] |
| mixed-emotion | Mixed-emotion | MultiLabelQuestion | True | N/A | ['joy', 'anger', 'sadness', 'fear', 'surprise', 'love'] |
**✨ NEW** Additionally, we also have **suggestions**, which are linked to the existing questions and named by appending "-suggestion" and "-suggestion-metadata" to the question names; these contain the value/s of the suggestion and its metadata, respectively. The possible values are the same as in the table above.
Finally, the **guidelines** are just a plain string that can be used to provide instructions to the annotators. Find those in the [annotation guidelines](#annotation-guidelines) section.
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"fields": {
"text": "i didnt feel humiliated"
},
"metadata": {},
"responses": [
{
"status": "submitted",
"user_id": "041dc67d-3cf4-44b0-9f60-818a63bfbb52",
"values": {
"mixed-emotion": {
"value": [
"surprise"
]
},
"sentiment": {
"value": "positive"
}
}
}
],
"suggestions": []
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"external_id": null,
"metadata": "{}",
"mixed-emotion": [
{
"status": "submitted",
"user_id": "041dc67d-3cf4-44b0-9f60-818a63bfbb52",
"value": [
"surprise"
]
}
],
"mixed-emotion-suggestion": null,
"mixed-emotion-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"sentiment": [
{
"status": "submitted",
"user_id": "041dc67d-3cf4-44b0-9f60-818a63bfbb52",
"value": "positive"
}
],
"sentiment-suggestion": null,
"sentiment-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"text": "i didnt feel humiliated"
}
```
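A minimal sketch of pulling the submitted annotations out of one record in the HuggingFace `datasets` layout shown above. The record literal is copied (abbreviated) from the example; in practice it would come from `load_dataset("Jackson428/emotion-c")`:

```python
# Extract the values of all "submitted" responses for a given question
# from one record in the HuggingFace `datasets` layout shown above.
record = {
    "sentiment": [
        {"status": "submitted",
         "user_id": "041dc67d-3cf4-44b0-9f60-818a63bfbb52",
         "value": "positive"}
    ],
    "mixed-emotion": [
        {"status": "submitted",
         "user_id": "041dc67d-3cf4-44b0-9f60-818a63bfbb52",
         "value": ["surprise"]}
    ],
    "text": "i didnt feel humiliated",
}

def submitted_values(record, question):
    """Return the values of all 'submitted' responses for one question."""
    return [r["value"] for r in record[question] if r["status"] == "submitted"]

print(submitted_values(record, "sentiment"))      # ['positive']
print(submitted_values(record, "mixed-emotion"))  # [['surprise']]
```

Note that each question field is a list of responses (one per annotator), so a record can carry multiple submitted values for the same question.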
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves; for the moment only text fields are supported. These are the ones that will be used to provide responses to the questions.
* **text** is of type `TextField`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **sentiment** is of type `LabelQuestion` with the following allowed values ['positive', 'neutral', 'negative'].
* **mixed-emotion** is of type `MultiLabelQuestion` with the following allowed values ['joy', 'anger', 'sadness', 'fear', 'surprise', 'love'].
* **✨ NEW** **Suggestions:** As of Argilla 1.13.0, the suggestions have been included to provide the annotators with suggestions to ease or assist during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.
* (optional) **sentiment-suggestion** is of type `label_selection` with the following allowed values ['positive', 'neutral', 'negative'].
* (optional) **mixed-emotion-suggestion** is of type `multi_label_selection` with the following allowed values ['joy', 'anger', 'sadness', 'fear', 'surprise', 'love'].
Additionally, we also have one more field which is optional and is the following:
* **external_id:** This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation guidelines
Emotion is a dataset of English Twitter messages with six basic emotions: anger, fear, joy, love, sadness, and surprise.
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Norquinal__llama-2-7b-claude-chat | 2023-09-17T21:50:30.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Norquinal/llama-2-7b-claude-chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Norquinal/llama-2-7b-claude-chat](https://huggingface.co/Norquinal/llama-2-7b-claude-chat)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Norquinal__llama-2-7b-claude-chat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T21:50:18.954049](https://huggingface.co/datasets/open-llm-leaderboard/details_Norquinal__llama-2-7b-claude-chat/blob/main/results_2023-09-17T21-50-18.954049.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0016778523489932886,\n\
\ \"em_stderr\": 0.00041913301788269335,\n \"f1\": 0.0588590604026846,\n\
\ \"f1_stderr\": 0.0013399207626085948,\n \"acc\": 0.4131723645607008,\n\
\ \"acc_stderr\": 0.009771744871869251\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.00041913301788269335,\n\
\ \"f1\": 0.0588590604026846,\n \"f1_stderr\": 0.0013399207626085948\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07733131159969674,\n \
\ \"acc_stderr\": 0.00735771352322235\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7490134175217048,\n \"acc_stderr\": 0.012185776220516153\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Norquinal/llama-2-7b-claude-chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|arc:challenge|25_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T21_50_18.954049
path:
- '**/details_harness|drop|3_2023-09-17T21-50-18.954049.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T21-50-18.954049.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T21_50_18.954049
path:
- '**/details_harness|gsm8k|5_2023-09-17T21-50-18.954049.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T21-50-18.954049.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hellaswag|10_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T02:05:21.068223.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T02:05:21.068223.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T02:05:21.068223.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T21_50_18.954049
path:
- '**/details_harness|winogrande|5_2023-09-17T21-50-18.954049.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T21-50-18.954049.parquet'
- config_name: results
data_files:
- split: 2023_08_30T02_05_21.068223
path:
- results_2023-08-30T02:05:21.068223.parquet
- split: 2023_09_17T21_50_18.954049
path:
- results_2023-09-17T21-50-18.954049.parquet
- split: latest
path:
- results_2023-09-17T21-50-18.954049.parquet
---
# Dataset Card for Evaluation run of Norquinal/llama-2-7b-claude-chat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Norquinal/llama-2-7b-claude-chat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Norquinal/llama-2-7b-claude-chat](https://huggingface.co/Norquinal/llama-2-7b-claude-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
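Because each split is named after the timestamp of its run (in a fixed-width, year-first format), the most recent run can be recovered by a simple lexicographic comparison of the split names. A minimal sketch, using two split names that appear in this card's configs:

```python
# Split names encode run timestamps like "2023_08_30T02_05_21.068223".
# The format is fixed-width and ordered year -> month -> day -> time,
# so lexicographic order matches chronological order and max() picks
# the most recent run (what the "latest" split points to).
splits = [
    "2023_08_30T02_05_21.068223",
    "2023_09_17T21_50_18.954049",
]
latest = max(splits)
print(latest)  # -> 2023_09_17T21_50_18.954049
```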
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Norquinal__llama-2-7b-claude-chat",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T21:50:18.954049](https://huggingface.co/datasets/open-llm-leaderboard/details_Norquinal__llama-2-7b-claude-chat/blob/main/results_2023-09-17T21-50-18.954049.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in its results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0016778523489932886,
"em_stderr": 0.00041913301788269335,
"f1": 0.0588590604026846,
"f1_stderr": 0.0013399207626085948,
"acc": 0.4131723645607008,
"acc_stderr": 0.009771744871869251
},
"harness|drop|3": {
"em": 0.0016778523489932886,
"em_stderr": 0.00041913301788269335,
"f1": 0.0588590604026846,
"f1_stderr": 0.0013399207626085948
},
"harness|gsm8k|5": {
"acc": 0.07733131159969674,
"acc_stderr": 0.00735771352322235
},
"harness|winogrande|5": {
"acc": 0.7490134175217048,
"acc_stderr": 0.012185776220516153
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Norquinal__llama-2-7b-claude-chat-rp | 2023-09-17T21:52:26.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Norquinal/llama-2-7b-claude-chat-rp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Norquinal/llama-2-7b-claude-chat-rp](https://huggingface.co/Norquinal/llama-2-7b-claude-chat-rp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Norquinal__llama-2-7b-claude-chat-rp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T21:52:14.967446](https://huggingface.co/datasets/open-llm-leaderboard/details_Norquinal__llama-2-7b-claude-chat-rp/blob/main/results_2023-09-17T21-52-14.967446.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in its results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001572986577181208,\n\
\ \"em_stderr\": 0.00040584511324177306,\n \"f1\": 0.05937500000000002,\n\
\ \"f1_stderr\": 0.0013421305212247912,\n \"acc\": 0.4101086482368971,\n\
\ \"acc_stderr\": 0.009683376605280784\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001572986577181208,\n \"em_stderr\": 0.00040584511324177306,\n\
\ \"f1\": 0.05937500000000002,\n \"f1_stderr\": 0.0013421305212247912\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07278241091736164,\n \
\ \"acc_stderr\": 0.007155604761167465\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7474348855564326,\n \"acc_stderr\": 0.012211148449394105\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Norquinal/llama-2-7b-claude-chat-rp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|arc:challenge|25_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T21_52_14.967446
path:
- '**/details_harness|drop|3_2023-09-17T21-52-14.967446.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T21-52-14.967446.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T21_52_14.967446
path:
- '**/details_harness|gsm8k|5_2023-09-17T21-52-14.967446.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T21-52-14.967446.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hellaswag|10_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T02:07:51.565435.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T02:07:51.565435.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T02:07:51.565435.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T21_52_14.967446
path:
- '**/details_harness|winogrande|5_2023-09-17T21-52-14.967446.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T21-52-14.967446.parquet'
- config_name: results
data_files:
- split: 2023_08_30T02_07_51.565435
path:
- results_2023-08-30T02:07:51.565435.parquet
- split: 2023_09_17T21_52_14.967446
path:
- results_2023-09-17T21-52-14.967446.parquet
- split: latest
path:
- results_2023-09-17T21-52-14.967446.parquet
---
# Dataset Card for Evaluation run of Norquinal/llama-2-7b-claude-chat-rp
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Norquinal/llama-2-7b-claude-chat-rp
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Norquinal/llama-2-7b-claude-chat-rp](https://huggingface.co/Norquinal/llama-2-7b-claude-chat-rp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
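Because the timestamped split names use zero-padded fields (`YYYY_MM_DDTHH_MM_SS...`), the split that the `latest` alias points to can also be recovered with a plain lexicographic comparison — a minimal sketch (the `resolve_latest` helper is illustrative, not part of any library):

```python
def resolve_latest(split_names):
    """Return the newest timestamp-named split.

    Split names such as '2023_08_30T02_07_51.565435' use zero-padded
    fields, so lexicographic order equals chronological order.
    """
    return max(s for s in split_names if s != "latest")

# Split names taken from the configurations above:
splits = ["2023_08_30T02_07_51.565435", "2023_09_17T21_52_14.967446", "latest"]
print(resolve_latest(splits))  # -> 2023_09_17T21_52_14.967446
```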
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Norquinal__llama-2-7b-claude-chat-rp",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T21:52:14.967446](https://huggingface.co/datasets/open-llm-leaderboard/details_Norquinal__llama-2-7b-claude-chat-rp/blob/main/results_2023-09-17T21-52-14.967446.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001572986577181208,
"em_stderr": 0.00040584511324177306,
"f1": 0.05937500000000002,
"f1_stderr": 0.0013421305212247912,
"acc": 0.4101086482368971,
"acc_stderr": 0.009683376605280784
},
"harness|drop|3": {
"em": 0.001572986577181208,
"em_stderr": 0.00040584511324177306,
"f1": 0.05937500000000002,
"f1_stderr": 0.0013421305212247912
},
"harness|gsm8k|5": {
"acc": 0.07278241091736164,
"acc_stderr": 0.007155604761167465
},
"harness|winogrande|5": {
"acc": 0.7474348855564326,
"acc_stderr": 0.012211148449394105
}
}
```
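As a sanity check on the aggregation, the `acc` value in the `"all"` block matches the unweighted mean of the per-task accuracies (gsm8k and winogrande here; drop reports em/f1 instead) — a quick sketch using the numbers above:

```python
# Per-task metrics copied from the latest-results JSON above.
results = {
    "harness|drop|3": {"em": 0.001572986577181208, "f1": 0.05937500000000002},
    "harness|gsm8k|5": {"acc": 0.07278241091736164},
    "harness|winogrande|5": {"acc": 0.7474348855564326},
}

# The aggregate "all"/"acc" is the plain mean over tasks reporting an accuracy.
accs = [m["acc"] for m in results.values() if "acc" in m]
mean_acc = sum(accs) / len(accs)
print(mean_acc)  # equal (up to float rounding) to 0.4101086482368971
```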
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
xixiyyds/config_store | 2023-08-30T02:12:32.000Z | [
"region:us"
] | xixiyyds | null | null | null | 0 | 0 | Entry not found |
AzulkerAI/Cricky | 2023-08-30T02:32:29.000Z | [
"region:us"
] | AzulkerAI | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_CHIH-HUNG__llama-2-7b-dolphin_10w-test | 2023-08-30T02:40:54.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-7b-dolphin_10w-test
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-7b-dolphin_10w-test](https://huggingface.co/CHIH-HUNG/llama-2-7b-dolphin_10w-test)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-7b-dolphin_10w-test\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-30T02:39:30.336527](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-7b-dolphin_10w-test/blob/main/results_2023-08-30T02%3A39%3A30.336527.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.441355572237513,\n\
\ \"acc_stderr\": 0.035230208768426426,\n \"acc_norm\": 0.44554035341923276,\n\
\ \"acc_norm_stderr\": 0.03522008237001183,\n \"mc1\": 0.2692778457772338,\n\
\ \"mc1_stderr\": 0.01552856663708728,\n \"mc2\": 0.42075433716694316,\n\
\ \"mc2_stderr\": 0.014621475505823845\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4684300341296928,\n \"acc_stderr\": 0.014582236460866977,\n\
\ \"acc_norm\": 0.5170648464163823,\n \"acc_norm_stderr\": 0.014602878388536597\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5467038438558056,\n\
\ \"acc_stderr\": 0.004967965810199991,\n \"acc_norm\": 0.7449711212905795,\n\
\ \"acc_norm_stderr\": 0.004349866376068983\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.040179012759817494,\n\
\ \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.040179012759817494\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.42,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.49056603773584906,\n \"acc_stderr\": 0.0307673947078081,\n\
\ \"acc_norm\": 0.49056603773584906,\n \"acc_norm_stderr\": 0.0307673947078081\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4236111111111111,\n\
\ \"acc_stderr\": 0.04132125019723369,\n \"acc_norm\": 0.4236111111111111,\n\
\ \"acc_norm_stderr\": 0.04132125019723369\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.42196531791907516,\n\
\ \"acc_stderr\": 0.0376574669386515,\n \"acc_norm\": 0.42196531791907516,\n\
\ \"acc_norm_stderr\": 0.0376574669386515\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.03708284662416542,\n\
\ \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.03708284662416542\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4085106382978723,\n \"acc_stderr\": 0.03213418026701576,\n\
\ \"acc_norm\": 0.4085106382978723,\n \"acc_norm_stderr\": 0.03213418026701576\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4206896551724138,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.4206896551724138,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2698412698412698,\n \"acc_stderr\": 0.02286083830923207,\n \"\
acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.02286083830923207\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.03932537680392871,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.03932537680392871\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.45483870967741935,\n\
\ \"acc_stderr\": 0.028327743091561067,\n \"acc_norm\": 0.45483870967741935,\n\
\ \"acc_norm_stderr\": 0.028327743091561067\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.03255086769970103,\n\
\ \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.03255086769970103\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5575757575757576,\n \"acc_stderr\": 0.03878372113711274,\n\
\ \"acc_norm\": 0.5575757575757576,\n \"acc_norm_stderr\": 0.03878372113711274\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5404040404040404,\n \"acc_stderr\": 0.035507024651313425,\n \"\
acc_norm\": 0.5404040404040404,\n \"acc_norm_stderr\": 0.035507024651313425\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5958549222797928,\n \"acc_stderr\": 0.0354150857888402,\n\
\ \"acc_norm\": 0.5958549222797928,\n \"acc_norm_stderr\": 0.0354150857888402\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4153846153846154,\n \"acc_stderr\": 0.024985354923102325,\n\
\ \"acc_norm\": 0.4153846153846154,\n \"acc_norm_stderr\": 0.024985354923102325\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073828,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073828\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.37815126050420167,\n \"acc_stderr\": 0.031499305777849054,\n\
\ \"acc_norm\": 0.37815126050420167,\n \"acc_norm_stderr\": 0.031499305777849054\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5853211009174312,\n \"acc_stderr\": 0.02112290320860259,\n \"\
acc_norm\": 0.5853211009174312,\n \"acc_norm_stderr\": 0.02112290320860259\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4166666666666667,\n \"acc_stderr\": 0.03362277436608043,\n \"\
acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03362277436608043\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5637254901960784,\n \"acc_stderr\": 0.03480693138457039,\n \"\
acc_norm\": 0.5637254901960784,\n \"acc_norm_stderr\": 0.03480693138457039\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.030685820596610805,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.030685820596610805\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.43946188340807174,\n\
\ \"acc_stderr\": 0.03331092511038179,\n \"acc_norm\": 0.43946188340807174,\n\
\ \"acc_norm_stderr\": 0.03331092511038179\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.48854961832061067,\n \"acc_stderr\": 0.043841400240780176,\n\
\ \"acc_norm\": 0.48854961832061067,\n \"acc_norm_stderr\": 0.043841400240780176\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6033057851239669,\n \"acc_stderr\": 0.044658697805310094,\n \"\
acc_norm\": 0.6033057851239669,\n \"acc_norm_stderr\": 0.044658697805310094\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4171779141104294,\n \"acc_stderr\": 0.038741028598180814,\n\
\ \"acc_norm\": 0.4171779141104294,\n \"acc_norm_stderr\": 0.038741028598180814\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291518,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291518\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5728155339805825,\n \"acc_stderr\": 0.04897957737781168,\n\
\ \"acc_norm\": 0.5728155339805825,\n \"acc_norm_stderr\": 0.04897957737781168\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03088273697413866,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03088273697413866\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6079182630906769,\n\
\ \"acc_stderr\": 0.017458524050147636,\n \"acc_norm\": 0.6079182630906769,\n\
\ \"acc_norm_stderr\": 0.017458524050147636\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4508670520231214,\n \"acc_stderr\": 0.026788811931562764,\n\
\ \"acc_norm\": 0.4508670520231214,\n \"acc_norm_stderr\": 0.026788811931562764\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26256983240223464,\n\
\ \"acc_stderr\": 0.014716824273017763,\n \"acc_norm\": 0.26256983240223464,\n\
\ \"acc_norm_stderr\": 0.014716824273017763\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.42483660130718953,\n \"acc_stderr\": 0.02830457667314112,\n\
\ \"acc_norm\": 0.42483660130718953,\n \"acc_norm_stderr\": 0.02830457667314112\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.49517684887459806,\n\
\ \"acc_stderr\": 0.028396770444111298,\n \"acc_norm\": 0.49517684887459806,\n\
\ \"acc_norm_stderr\": 0.028396770444111298\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.47530864197530864,\n \"acc_stderr\": 0.027786800931427443,\n\
\ \"acc_norm\": 0.47530864197530864,\n \"acc_norm_stderr\": 0.027786800931427443\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3617021276595745,\n \"acc_stderr\": 0.0286638201471995,\n \
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.0286638201471995\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.33833116036505867,\n\
\ \"acc_stderr\": 0.012084265626344194,\n \"acc_norm\": 0.33833116036505867,\n\
\ \"acc_norm_stderr\": 0.012084265626344194\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.40808823529411764,\n \"acc_stderr\": 0.02985526139348392,\n\
\ \"acc_norm\": 0.40808823529411764,\n \"acc_norm_stderr\": 0.02985526139348392\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4264705882352941,\n \"acc_stderr\": 0.02000791273935936,\n \
\ \"acc_norm\": 0.4264705882352941,\n \"acc_norm_stderr\": 0.02000791273935936\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5454545454545454,\n\
\ \"acc_stderr\": 0.04769300568972744,\n \"acc_norm\": 0.5454545454545454,\n\
\ \"acc_norm_stderr\": 0.04769300568972744\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.45714285714285713,\n \"acc_stderr\": 0.03189141832421397,\n\
\ \"acc_norm\": 0.45714285714285713,\n \"acc_norm_stderr\": 0.03189141832421397\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5074626865671642,\n\
\ \"acc_stderr\": 0.03535140084276719,\n \"acc_norm\": 0.5074626865671642,\n\
\ \"acc_norm_stderr\": 0.03535140084276719\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.03836722176598052,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.03836722176598052\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6140350877192983,\n \"acc_stderr\": 0.03733756969066164,\n\
\ \"acc_norm\": 0.6140350877192983,\n \"acc_norm_stderr\": 0.03733756969066164\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2692778457772338,\n\
\ \"mc1_stderr\": 0.01552856663708728,\n \"mc2\": 0.42075433716694316,\n\
\ \"mc2_stderr\": 0.014621475505823845\n }\n}\n```"
repo_url: https://huggingface.co/The-Face-Of-Goonery/Huginn-13b-v4.5
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|arc:challenge|25_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hellaswag|10_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T02:39:30.336527.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T02:39:30.336527.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T02:39:30.336527.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T02:39:30.336527.parquet'
- config_name: results
data_files:
- split: 2023_08_30T02_39_30.336527
path:
- results_2023-08-30T02:39:30.336527.parquet
- split: latest
path:
- results_2023-08-30T02:39:30.336527.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-7b-dolphin_10w-test
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-7b-dolphin_10w-test
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-7b-dolphin_10w-test](https://huggingface.co/CHIH-HUNG/llama-2-7b-dolphin_10w-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-7b-dolphin_10w-test",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-30T02:39:30.336527](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-7b-dolphin_10w-test/blob/main/results_2023-08-30T02%3A39%3A30.336527.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "latest" split of its configuration):
```python
{
"all": {
"acc": 0.441355572237513,
"acc_stderr": 0.035230208768426426,
"acc_norm": 0.44554035341923276,
"acc_norm_stderr": 0.03522008237001183,
"mc1": 0.2692778457772338,
"mc1_stderr": 0.01552856663708728,
"mc2": 0.42075433716694316,
"mc2_stderr": 0.014621475505823845
},
"harness|arc:challenge|25": {
"acc": 0.4684300341296928,
"acc_stderr": 0.014582236460866977,
"acc_norm": 0.5170648464163823,
"acc_norm_stderr": 0.014602878388536597
},
"harness|hellaswag|10": {
"acc": 0.5467038438558056,
"acc_stderr": 0.004967965810199991,
"acc_norm": 0.7449711212905795,
"acc_norm_stderr": 0.004349866376068983
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.040179012759817494,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.040179012759817494
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.49056603773584906,
"acc_stderr": 0.0307673947078081,
"acc_norm": 0.49056603773584906,
"acc_norm_stderr": 0.0307673947078081
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4236111111111111,
"acc_stderr": 0.04132125019723369,
"acc_norm": 0.4236111111111111,
"acc_norm_stderr": 0.04132125019723369
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.42196531791907516,
"acc_stderr": 0.0376574669386515,
"acc_norm": 0.42196531791907516,
"acc_norm_stderr": 0.0376574669386515
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.03708284662416542,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.03708284662416542
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4085106382978723,
"acc_stderr": 0.03213418026701576,
"acc_norm": 0.4085106382978723,
"acc_norm_stderr": 0.03213418026701576
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4206896551724138,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.4206896551724138,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.02286083830923207,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.02286083830923207
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.03932537680392871,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.03932537680392871
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.45483870967741935,
"acc_stderr": 0.028327743091561067,
"acc_norm": 0.45483870967741935,
"acc_norm_stderr": 0.028327743091561067
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.03255086769970103,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.03255086769970103
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5575757575757576,
"acc_stderr": 0.03878372113711274,
"acc_norm": 0.5575757575757576,
"acc_norm_stderr": 0.03878372113711274
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5404040404040404,
"acc_stderr": 0.035507024651313425,
"acc_norm": 0.5404040404040404,
"acc_norm_stderr": 0.035507024651313425
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5958549222797928,
"acc_stderr": 0.0354150857888402,
"acc_norm": 0.5958549222797928,
"acc_norm_stderr": 0.0354150857888402
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4153846153846154,
"acc_stderr": 0.024985354923102325,
"acc_norm": 0.4153846153846154,
"acc_norm_stderr": 0.024985354923102325
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073828,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073828
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.37815126050420167,
"acc_stderr": 0.031499305777849054,
"acc_norm": 0.37815126050420167,
"acc_norm_stderr": 0.031499305777849054
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5853211009174312,
"acc_stderr": 0.02112290320860259,
"acc_norm": 0.5853211009174312,
"acc_norm_stderr": 0.02112290320860259
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03362277436608043,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03362277436608043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5637254901960784,
"acc_stderr": 0.03480693138457039,
"acc_norm": 0.5637254901960784,
"acc_norm_stderr": 0.03480693138457039
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.030685820596610805,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.030685820596610805
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.43946188340807174,
"acc_stderr": 0.03331092511038179,
"acc_norm": 0.43946188340807174,
"acc_norm_stderr": 0.03331092511038179
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.48854961832061067,
"acc_stderr": 0.043841400240780176,
"acc_norm": 0.48854961832061067,
"acc_norm_stderr": 0.043841400240780176
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6033057851239669,
"acc_stderr": 0.044658697805310094,
"acc_norm": 0.6033057851239669,
"acc_norm_stderr": 0.044658697805310094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4171779141104294,
"acc_stderr": 0.038741028598180814,
"acc_norm": 0.4171779141104294,
"acc_norm_stderr": 0.038741028598180814
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291518,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291518
},
"harness|hendrycksTest-management|5": {
"acc": 0.5728155339805825,
"acc_stderr": 0.04897957737781168,
"acc_norm": 0.5728155339805825,
"acc_norm_stderr": 0.04897957737781168
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03088273697413866,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03088273697413866
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6079182630906769,
"acc_stderr": 0.017458524050147636,
"acc_norm": 0.6079182630906769,
"acc_norm_stderr": 0.017458524050147636
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4508670520231214,
"acc_stderr": 0.026788811931562764,
"acc_norm": 0.4508670520231214,
"acc_norm_stderr": 0.026788811931562764
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.26256983240223464,
"acc_stderr": 0.014716824273017763,
"acc_norm": 0.26256983240223464,
"acc_norm_stderr": 0.014716824273017763
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.42483660130718953,
"acc_stderr": 0.02830457667314112,
"acc_norm": 0.42483660130718953,
"acc_norm_stderr": 0.02830457667314112
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.49517684887459806,
"acc_stderr": 0.028396770444111298,
"acc_norm": 0.49517684887459806,
"acc_norm_stderr": 0.028396770444111298
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.47530864197530864,
"acc_stderr": 0.027786800931427443,
"acc_norm": 0.47530864197530864,
"acc_norm_stderr": 0.027786800931427443
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.0286638201471995,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.0286638201471995
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.33833116036505867,
"acc_stderr": 0.012084265626344194,
"acc_norm": 0.33833116036505867,
"acc_norm_stderr": 0.012084265626344194
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.40808823529411764,
"acc_stderr": 0.02985526139348392,
"acc_norm": 0.40808823529411764,
"acc_norm_stderr": 0.02985526139348392
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4264705882352941,
"acc_stderr": 0.02000791273935936,
"acc_norm": 0.4264705882352941,
"acc_norm_stderr": 0.02000791273935936
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.04769300568972744,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.04769300568972744
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.45714285714285713,
"acc_stderr": 0.03189141832421397,
"acc_norm": 0.45714285714285713,
"acc_norm_stderr": 0.03189141832421397
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5074626865671642,
"acc_stderr": 0.03535140084276719,
"acc_norm": 0.5074626865671642,
"acc_norm_stderr": 0.03535140084276719
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.03836722176598052,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.03836722176598052
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6140350877192983,
"acc_stderr": 0.03733756969066164,
"acc_norm": 0.6140350877192983,
"acc_norm_stderr": 0.03733756969066164
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2692778457772338,
"mc1_stderr": 0.01552856663708728,
"mc2": 0.42075433716694316,
"mc2_stderr": 0.014621475505823845
}
}
```
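As a quick illustration (a minimal sketch, not part of the evaluation harness), per-task accuracies in a results dictionary shaped like the one above can be ranked; only a few of the tasks are reproduced here for brevity:

```python
# Rank MMLU subtasks by accuracy from a results dict shaped like the one above.
# Values are copied from the "latest results" JSON; most tasks are omitted.
results = {
    "harness|hendrycksTest-marketing|5": {"acc": 0.6666666666666666},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.6140350877192983},
    "harness|hendrycksTest-college_physics|5": {"acc": 0.16666666666666666},
}

# Sort task names by accuracy, highest first.
ranked = sorted(results, key=lambda task: results[task]["acc"], reverse=True)
for task in ranked:
    print(f"{task}: {results[task]['acc']:.3f}")
```

The same pattern works on the full dictionary after loading the "results" configuration with `datasets`.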
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Chess1/Chess1 | 2023-08-30T03:15:20.000Z | [
"region:us"
] | Chess1 | null | null | null | 0 | 0 | ---
# For reference on model card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/datasetcard.md?plain=1
# Doc / guide: https://huggingface.co/docs/hub/datasets-cards
{}
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
chan2333/sd | 2023-09-11T07:38:28.000Z | [
"region:us"
] | chan2333 | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_fangloveskari__Dolphin_ORCA_LLaMA_70b_QLoRA | 2023-08-30T03:09:59.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of fangloveskari/Dolphin_ORCA_LLaMA_70b_QLoRA
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [fangloveskari/Dolphin_ORCA_LLaMA_70b_QLoRA](https://huggingface.co/fangloveskari/Dolphin_ORCA_LLaMA_70b_QLoRA)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fangloveskari__Dolphin_ORCA_LLaMA_70b_QLoRA\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-30T03:08:37.403827](https://huggingface.co/datasets/open-llm-leaderboard/details_fangloveskari__Dolphin_ORCA_LLaMA_70b_QLoRA/blob/main/results_2023-08-30T03%3A08%3A37.403827.json):\n\
\n```python\n{\n \"all\": {\n \"acc\": 0.7016950821019889,\n \"\
acc_stderr\": 0.03100773424505602,\n \"acc_norm\": 0.7055688798324372,\n\
\ \"acc_norm_stderr\": 0.030976198338743925,\n \"mc1\": 0.4528763769889841,\n\
\ \"mc1_stderr\": 0.01742558984831402,\n \"mc2\": 0.6337134354987094,\n\
\ \"mc2_stderr\": 0.014897273290786066\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6834470989761092,\n \"acc_stderr\": 0.01359243151906808,\n\
\ \"acc_norm\": 0.7226962457337884,\n \"acc_norm_stderr\": 0.013082095839059374\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6881099382593109,\n\
\ \"acc_stderr\": 0.004623184227344766,\n \"acc_norm\": 0.877414857598088,\n\
\ \"acc_norm_stderr\": 0.0032729014349397656\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8026315789473685,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.8026315789473685,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7471698113207547,\n \"acc_stderr\": 0.026749899771241214,\n\
\ \"acc_norm\": 0.7471698113207547,\n \"acc_norm_stderr\": 0.026749899771241214\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8263888888888888,\n\
\ \"acc_stderr\": 0.031674733837957166,\n \"acc_norm\": 0.8263888888888888,\n\
\ \"acc_norm_stderr\": 0.031674733837957166\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6851063829787234,\n \"acc_stderr\": 0.030363582197238167,\n\
\ \"acc_norm\": 0.6851063829787234,\n \"acc_norm_stderr\": 0.030363582197238167\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6413793103448275,\n \"acc_stderr\": 0.03996629574876719,\n\
\ \"acc_norm\": 0.6413793103448275,\n \"acc_norm_stderr\": 0.03996629574876719\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.47883597883597884,\n \"acc_stderr\": 0.025728230952130723,\n \"\
acc_norm\": 0.47883597883597884,\n \"acc_norm_stderr\": 0.025728230952130723\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8096774193548387,\n\
\ \"acc_stderr\": 0.022331707611823074,\n \"acc_norm\": 0.8096774193548387,\n\
\ \"acc_norm_stderr\": 0.022331707611823074\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\"\
: 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781678,\n\
\ \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781678\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8888888888888888,\n \"acc_stderr\": 0.02239078763821677,\n \"\
acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.02239078763821677\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9326424870466321,\n \"acc_stderr\": 0.0180883938390789,\n\
\ \"acc_norm\": 0.9326424870466321,\n \"acc_norm_stderr\": 0.0180883938390789\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7025641025641025,\n \"acc_stderr\": 0.023177408131465946,\n\
\ \"acc_norm\": 0.7025641025641025,\n \"acc_norm_stderr\": 0.023177408131465946\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.027553614467863814,\n\
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.027553614467863814\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4768211920529801,\n \"acc_stderr\": 0.04078093859163083,\n \"\
acc_norm\": 0.4768211920529801,\n \"acc_norm_stderr\": 0.04078093859163083\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9027522935779817,\n \"acc_stderr\": 0.012703533408540366,\n \"\
acc_norm\": 0.9027522935779817,\n \"acc_norm_stderr\": 0.012703533408540366\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5925925925925926,\n \"acc_stderr\": 0.033509916046960436,\n \"\
acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.033509916046960436\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9264705882352942,\n \"acc_stderr\": 0.01831885585008968,\n \"\
acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.01831885585008968\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8945147679324894,\n \"acc_stderr\": 0.01999556072375854,\n \
\ \"acc_norm\": 0.8945147679324894,\n \"acc_norm_stderr\": 0.01999556072375854\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7757847533632287,\n\
\ \"acc_stderr\": 0.027991534258519517,\n \"acc_norm\": 0.7757847533632287,\n\
\ \"acc_norm_stderr\": 0.027991534258519517\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8396946564885496,\n \"acc_stderr\": 0.03217829420744632,\n\
\ \"acc_norm\": 0.8396946564885496,\n \"acc_norm_stderr\": 0.03217829420744632\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035196,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035196\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8220858895705522,\n \"acc_stderr\": 0.03004735765580663,\n\
\ \"acc_norm\": 0.8220858895705522,\n \"acc_norm_stderr\": 0.03004735765580663\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n\
\ \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n\
\ \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8659003831417624,\n\
\ \"acc_stderr\": 0.012185528166499978,\n \"acc_norm\": 0.8659003831417624,\n\
\ \"acc_norm_stderr\": 0.012185528166499978\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7716763005780347,\n \"acc_stderr\": 0.022598703804321635,\n\
\ \"acc_norm\": 0.7716763005780347,\n \"acc_norm_stderr\": 0.022598703804321635\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5843575418994413,\n\
\ \"acc_stderr\": 0.016482782187500683,\n \"acc_norm\": 0.5843575418994413,\n\
\ \"acc_norm_stderr\": 0.016482782187500683\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.02473998135511359,\n\
\ \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.02473998135511359\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7620578778135049,\n\
\ \"acc_stderr\": 0.024185150647818707,\n \"acc_norm\": 0.7620578778135049,\n\
\ \"acc_norm_stderr\": 0.024185150647818707\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8271604938271605,\n \"acc_stderr\": 0.021038517770157382,\n\
\ \"acc_norm\": 0.8271604938271605,\n \"acc_norm_stderr\": 0.021038517770157382\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5673758865248227,\n \"acc_stderr\": 0.029555454236778838,\n \
\ \"acc_norm\": 0.5673758865248227,\n \"acc_norm_stderr\": 0.029555454236778838\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5710560625814863,\n\
\ \"acc_stderr\": 0.012640625443067365,\n \"acc_norm\": 0.5710560625814863,\n\
\ \"acc_norm_stderr\": 0.012640625443067365\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7205882352941176,\n \"acc_stderr\": 0.027257202606114948,\n\
\ \"acc_norm\": 0.7205882352941176,\n \"acc_norm_stderr\": 0.027257202606114948\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.761437908496732,\n \"acc_stderr\": 0.01724238582877962,\n \
\ \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.01724238582877962\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.02560737598657916,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.02560737598657916\n },\n\
\ \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n\
\ \"acc_stderr\": 0.02207632610182466,\n \"acc_norm\": 0.8905472636815921,\n\
\ \"acc_norm_stderr\": 0.02207632610182466\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n\
\ \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4528763769889841,\n\
\ \"mc1_stderr\": 0.01742558984831402,\n \"mc2\": 0.6337134354987094,\n\
\ \"mc2_stderr\": 0.014897273290786066\n }\n}\n```"
repo_url: https://huggingface.co/The-Face-Of-Goonery/Huginn-13b-v4.5
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|arc:challenge|25_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hellaswag|10_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T03:08:37.403827.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T03:08:37.403827.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T03:08:37.403827.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T03:08:37.403827.parquet'
- config_name: results
data_files:
- split: 2023_08_30T03_08_37.403827
path:
- results_2023-08-30T03:08:37.403827.parquet
- split: latest
path:
- results_2023-08-30T03:08:37.403827.parquet
---
# Dataset Card for Evaluation run of fangloveskari/Dolphin_ORCA_LLaMA_70b_QLoRA
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/fangloveskari/Dolphin_ORCA_LLaMA_70b_QLoRA
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [fangloveskari/Dolphin_ORCA_LLaMA_70b_QLoRA](https://huggingface.co/fangloveskari/Dolphin_ORCA_LLaMA_70b_QLoRA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fangloveskari__Dolphin_ORCA_LLaMA_70b_QLoRA",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-30T03:08:37.403827](https://huggingface.co/datasets/open-llm-leaderboard/details_fangloveskari__Dolphin_ORCA_LLaMA_70b_QLoRA/blob/main/results_2023-08-30T03%3A08%3A37.403827.json):
```python
{
"all": {
"acc": 0.7016950821019889,
"acc_stderr": 0.03100773424505602,
"acc_norm": 0.7055688798324372,
"acc_norm_stderr": 0.030976198338743925,
"mc1": 0.4528763769889841,
"mc1_stderr": 0.01742558984831402,
"mc2": 0.6337134354987094,
"mc2_stderr": 0.014897273290786066
},
"harness|arc:challenge|25": {
"acc": 0.6834470989761092,
"acc_stderr": 0.01359243151906808,
"acc_norm": 0.7226962457337884,
"acc_norm_stderr": 0.013082095839059374
},
"harness|hellaswag|10": {
"acc": 0.6881099382593109,
"acc_stderr": 0.004623184227344766,
"acc_norm": 0.877414857598088,
"acc_norm_stderr": 0.0032729014349397656
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8026315789473685,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.8026315789473685,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7471698113207547,
"acc_stderr": 0.026749899771241214,
"acc_norm": 0.7471698113207547,
"acc_norm_stderr": 0.026749899771241214
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8263888888888888,
"acc_stderr": 0.031674733837957166,
"acc_norm": 0.8263888888888888,
"acc_norm_stderr": 0.031674733837957166
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6851063829787234,
"acc_stderr": 0.030363582197238167,
"acc_norm": 0.6851063829787234,
"acc_norm_stderr": 0.030363582197238167
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6413793103448275,
"acc_stderr": 0.03996629574876719,
"acc_norm": 0.6413793103448275,
"acc_norm_stderr": 0.03996629574876719
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47883597883597884,
"acc_stderr": 0.025728230952130723,
"acc_norm": 0.47883597883597884,
"acc_norm_stderr": 0.025728230952130723
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8096774193548387,
"acc_stderr": 0.022331707611823074,
"acc_norm": 0.8096774193548387,
"acc_norm_stderr": 0.022331707611823074
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.541871921182266,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.541871921182266,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781678,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781678
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.02239078763821677,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.02239078763821677
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9326424870466321,
"acc_stderr": 0.0180883938390789,
"acc_norm": 0.9326424870466321,
"acc_norm_stderr": 0.0180883938390789
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7025641025641025,
"acc_stderr": 0.023177408131465946,
"acc_norm": 0.7025641025641025,
"acc_norm_stderr": 0.023177408131465946
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.027553614467863814,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.027553614467863814
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4768211920529801,
"acc_stderr": 0.04078093859163083,
"acc_norm": 0.4768211920529801,
"acc_norm_stderr": 0.04078093859163083
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9027522935779817,
"acc_stderr": 0.012703533408540366,
"acc_norm": 0.9027522935779817,
"acc_norm_stderr": 0.012703533408540366
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.033509916046960436,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.033509916046960436
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.01831885585008968,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.01831885585008968
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8945147679324894,
"acc_stderr": 0.01999556072375854,
"acc_norm": 0.8945147679324894,
"acc_norm_stderr": 0.01999556072375854
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7757847533632287,
"acc_stderr": 0.027991534258519517,
"acc_norm": 0.7757847533632287,
"acc_norm_stderr": 0.027991534258519517
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8396946564885496,
"acc_stderr": 0.03217829420744632,
"acc_norm": 0.8396946564885496,
"acc_norm_stderr": 0.03217829420744632
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035196,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035196
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8220858895705522,
"acc_stderr": 0.03004735765580663,
"acc_norm": 0.8220858895705522,
"acc_norm_stderr": 0.03004735765580663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8659003831417624,
"acc_stderr": 0.012185528166499978,
"acc_norm": 0.8659003831417624,
"acc_norm_stderr": 0.012185528166499978
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7716763005780347,
"acc_stderr": 0.022598703804321635,
"acc_norm": 0.7716763005780347,
"acc_norm_stderr": 0.022598703804321635
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5843575418994413,
"acc_stderr": 0.016482782187500683,
"acc_norm": 0.5843575418994413,
"acc_norm_stderr": 0.016482782187500683
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7620578778135049,
"acc_stderr": 0.024185150647818707,
"acc_norm": 0.7620578778135049,
"acc_norm_stderr": 0.024185150647818707
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8271604938271605,
"acc_stderr": 0.021038517770157382,
"acc_norm": 0.8271604938271605,
"acc_norm_stderr": 0.021038517770157382
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5673758865248227,
"acc_stderr": 0.029555454236778838,
"acc_norm": 0.5673758865248227,
"acc_norm_stderr": 0.029555454236778838
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5710560625814863,
"acc_stderr": 0.012640625443067365,
"acc_norm": 0.5710560625814863,
"acc_norm_stderr": 0.012640625443067365
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7205882352941176,
"acc_stderr": 0.027257202606114948,
"acc_norm": 0.7205882352941176,
"acc_norm_stderr": 0.027257202606114948
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.761437908496732,
"acc_stderr": 0.01724238582877962,
"acc_norm": 0.761437908496732,
"acc_norm_stderr": 0.01724238582877962
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8,
"acc_stderr": 0.02560737598657916,
"acc_norm": 0.8,
"acc_norm_stderr": 0.02560737598657916
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8905472636815921,
"acc_stderr": 0.02207632610182466,
"acc_norm": 0.8905472636815921,
"acc_norm_stderr": 0.02207632610182466
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.026168221344662297,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.026168221344662297
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4528763769889841,
"mc1_stderr": 0.01742558984831402,
"mc2": 0.6337134354987094,
"mc2_stderr": 0.014897273290786066
}
}
```
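For quick local comparisons, the headline numbers above can be combined without re-downloading anything. The sketch below takes an unweighted mean of four of the metrics copied from the JSON (note this is only an illustration: the "all" accuracy is the mean over every harness task, and the leaderboard's displayed average may weight or select tasks differently):

```python
# Illustrative only: combine headline metrics from the results JSON above
# into a single unweighted score. Values are copied verbatim from the dump.
headline = {
    "arc_challenge_acc_norm": 0.7226962457337884,  # harness|arc:challenge|25
    "hellaswag_acc_norm": 0.877414857598088,       # harness|hellaswag|10
    "all_acc": 0.7016950821019889,                 # mean acc across all harness tasks
    "truthfulqa_mc2": 0.6337134354987094,          # harness|truthfulqa:mc|0
}

average = sum(headline.values()) / len(headline)
print(f"{average:.4f}")  # -> 0.7339
```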
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
eunyounglee/rm_data_squad_3 | 2023-08-30T03:24:10.000Z | [
"region:us"
] | eunyounglee | null | null | null | 0 | 0 | Entry not found |
littlepan07/Ts_12par_text | 2023-08-30T18:04:22.000Z | [
"region:us"
] | littlepan07 | null | null | null | 0 | 0 | Entry not found |
atyka/pinky | 2023-08-30T04:32:04.000Z | [
"region:us"
] | atyka | null | null | null | 0 | 0 | Entry not found |
bismillah123/alhamdulillah | 2023-08-30T04:33:21.000Z | [
"region:us"
] | bismillah123 | null | null | null | 0 | 0 | Entry not found |
qazisaad/llama_2_product_titles-esci_train | 2023-08-30T05:36:48.000Z | [
"region:us"
] | qazisaad | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: pos
path: data/pos-*
- split: neg
path: data/neg-*
dataset_info:
features:
- name: index
dtype: int64
- name: query
dtype: string
- name: average_score
dtype: float64
- name: total_score
dtype: float64
- name: text
dtype: string
splits:
- name: pos
num_bytes: 3339196
num_examples: 1564
- name: neg
num_bytes: 1753947
num_examples: 977
download_size: 542387
dataset_size: 5093143
---
# Dataset Card for "llama_2_product_titles-esci_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
silumanajag/situmang | 2023-08-30T04:25:58.000Z | [
"region:us"
] | silumanajag | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_anas-awadalla__mpt-7b | 2023-09-17T09:31:02.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of anas-awadalla/mpt-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [anas-awadalla/mpt-7b](https://huggingface.co/anas-awadalla/mpt-7b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_anas-awadalla__mpt-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T09:30:50.609279](https://huggingface.co/datasets/open-llm-leaderboard/details_anas-awadalla__mpt-7b/blob/main/results_2023-09-17T09-30-50.609279.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0006291946308724832,\n\
\ \"em_stderr\": 0.0002568002749724036,\n \"f1\": 0.055483431208053824,\n\
\ \"f1_stderr\": 0.0012896726370180557,\n \"acc\": 0.38078553207836646,\n\
\ \"acc_stderr\": 0.009004668193232201\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0006291946308724832,\n \"em_stderr\": 0.0002568002749724036,\n\
\ \"f1\": 0.055483431208053824,\n \"f1_stderr\": 0.0012896726370180557\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0401819560272934,\n \
\ \"acc_stderr\": 0.005409439736970527\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7213891081294396,\n \"acc_stderr\": 0.012599896649493876\n\
\ }\n}\n```"
repo_url: https://huggingface.co/anas-awadalla/mpt-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|arc:challenge|25_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T09_30_50.609279
path:
- '**/details_harness|drop|3_2023-09-17T09-30-50.609279.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T09-30-50.609279.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T09_30_50.609279
path:
- '**/details_harness|gsm8k|5_2023-09-17T09-30-50.609279.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T09-30-50.609279.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hellaswag|10_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T03:55:09.041591.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T03:55:09.041591.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T03:55:09.041591.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T09_30_50.609279
path:
- '**/details_harness|winogrande|5_2023-09-17T09-30-50.609279.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T09-30-50.609279.parquet'
- config_name: results
data_files:
- split: 2023_08_30T03_55_09.041591
path:
- results_2023-08-30T03:55:09.041591.parquet
- split: 2023_09_17T09_30_50.609279
path:
- results_2023-09-17T09-30-50.609279.parquet
- split: latest
path:
- results_2023-09-17T09-30-50.609279.parquet
---
# Dataset Card for Evaluation run of anas-awadalla/mpt-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/anas-awadalla/mpt-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [anas-awadalla/mpt-7b](https://huggingface.co/anas-awadalla/mpt-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_anas-awadalla__mpt-7b",
"harness_winogrande_5",
	split="latest")
```
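As the configuration listing above shows, each run's split name is derived from the run timestamp by replacing the `-` and `:` characters with `_`. A small helper can make that mapping explicit (the naming rule here is an assumption inferred from the split names in this card, not an official API):

```python
def run_timestamp_to_split(ts: str) -> str:
    """Map a run timestamp (e.g. from a results filename) to its split name.

    Assumed rule, inferred from the splits listed in this card: "-" in the
    date and ":" in the time become "_", the fractional seconds are kept.
    """
    date, _, time = ts.partition("T")
    return date.replace("-", "_") + "T" + time.replace(":", "_")

print(run_timestamp_to_split("2023-09-17T09:30:50.609279"))
# → 2023_09_17T09_30_50.609279
```

This can be handy when you want to load the split for one specific run rather than `latest`.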
## Latest results
These are the [latest results from run 2023-09-17T09:30:50.609279](https://huggingface.co/datasets/open-llm-leaderboard/details_anas-awadalla__mpt-7b/blob/main/results_2023-09-17T09-30-50.609279.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0006291946308724832,
"em_stderr": 0.0002568002749724036,
"f1": 0.055483431208053824,
"f1_stderr": 0.0012896726370180557,
"acc": 0.38078553207836646,
"acc_stderr": 0.009004668193232201
},
"harness|drop|3": {
"em": 0.0006291946308724832,
"em_stderr": 0.0002568002749724036,
"f1": 0.055483431208053824,
"f1_stderr": 0.0012896726370180557
},
"harness|gsm8k|5": {
"acc": 0.0401819560272934,
"acc_stderr": 0.005409439736970527
},
"harness|winogrande|5": {
"acc": 0.7213891081294396,
"acc_stderr": 0.012599896649493876
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
qazisaad/llama_2_product_titles-esci_train-temp | 2023-08-30T04:04:43.000Z | [
"region:us"
] | qazisaad | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: index
dtype: int64
- name: query
dtype: string
- name: average_score
dtype: float64
- name: total_score
dtype: float64
- name: text
dtype: string
- name: preds
dtype: string
splits:
- name: train
num_bytes: 2985474
num_examples: 1564
download_size: 325190
dataset_size: 2985474
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama_2_product_titles-esci_train-temp"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
xuqinyang/x | 2023-08-30T04:10:34.000Z | [
"region:us"
] | xuqinyang | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_TheBloke__WizardLM-70B-V1.0-GPTQ | 2023-08-31T06:45:53.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TheBloke/WizardLM-70B-V1.0-GPTQ
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/WizardLM-70B-V1.0-GPTQ](https://huggingface.co/TheBloke/WizardLM-70B-V1.0-GPTQ)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__WizardLM-70B-V1.0-GPTQ\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"latest\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-31T06:45:23.824442](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__WizardLM-70B-V1.0-GPTQ/blob/main/results_2023-08-31T06%3A45%3A23.824442.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.636465710040601,\n\
\ \"acc_stderr\": 0.03280341903722105,\n \"acc_norm\": 0.6402033295604549,\n\
\ \"acc_norm_stderr\": 0.03278106024809485,\n \"mc1\": 0.3818849449204406,\n\
\ \"mc1_stderr\": 0.017008101939163495,\n \"mc2\": 0.5454276049890074,\n\
\ \"mc2_stderr\": 0.015570490235725166\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6075085324232082,\n \"acc_stderr\": 0.014269634635670722,\n\
\ \"acc_norm\": 0.6382252559726962,\n \"acc_norm_stderr\": 0.014041957945038076\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6486755626369249,\n\
\ \"acc_stderr\": 0.004764084597176899,\n \"acc_norm\": 0.838478390758813,\n\
\ \"acc_norm_stderr\": 0.00367259272936363\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.02914690474779833,\n\
\ \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.02914690474779833\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n\
\ \"acc_stderr\": 0.04537815354939392,\n \"acc_norm\": 0.3684210526315789,\n\
\ \"acc_norm_stderr\": 0.04537815354939392\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944413,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944413\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7516129032258064,\n\
\ \"acc_stderr\": 0.02458002892148101,\n \"acc_norm\": 0.7516129032258064,\n\
\ \"acc_norm_stderr\": 0.02458002892148101\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03011768892950357,\n\
\ \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03011768892950357\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586808,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586808\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644234,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644234\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6128205128205129,\n \"acc_stderr\": 0.024697216930878937,\n\
\ \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.024697216930878937\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606647,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606647\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.03017680828897434,\n \
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.03017680828897434\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4370860927152318,\n \"acc_stderr\": 0.04050035722230636,\n \"\
acc_norm\": 0.4370860927152318,\n \"acc_norm_stderr\": 0.04050035722230636\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092448,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092448\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854052,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854052\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8438818565400844,\n \"acc_stderr\": 0.023627159460318667,\n \
\ \"acc_norm\": 0.8438818565400844,\n \"acc_norm_stderr\": 0.023627159460318667\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7174887892376681,\n\
\ \"acc_stderr\": 0.030216831011508773,\n \"acc_norm\": 0.7174887892376681,\n\
\ \"acc_norm_stderr\": 0.030216831011508773\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097655,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097655\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n\
\ \"acc_stderr\": 0.013778693778464085,\n \"acc_norm\": 0.8186462324393359,\n\
\ \"acc_norm_stderr\": 0.013778693778464085\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.024946792225272314,\n\
\ \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.024946792225272314\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35195530726256985,\n\
\ \"acc_stderr\": 0.015972668523689074,\n \"acc_norm\": 0.35195530726256985,\n\
\ \"acc_norm_stderr\": 0.015972668523689074\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.026787453111906504,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.026787453111906504\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n\
\ \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.6881028938906752,\n\
\ \"acc_norm_stderr\": 0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.02532988817190092,\n\
\ \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.02532988817190092\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5097783572359843,\n\
\ \"acc_stderr\": 0.012767793787729338,\n \"acc_norm\": 0.5097783572359843,\n\
\ \"acc_norm_stderr\": 0.012767793787729338\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5808823529411765,\n \"acc_stderr\": 0.02997280717046462,\n\
\ \"acc_norm\": 0.5808823529411765,\n \"acc_norm_stderr\": 0.02997280717046462\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083376,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083376\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.763265306122449,\n \"acc_stderr\": 0.02721283588407315,\n\
\ \"acc_norm\": 0.763265306122449,\n \"acc_norm_stderr\": 0.02721283588407315\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8656716417910447,\n\
\ \"acc_stderr\": 0.02411267824090081,\n \"acc_norm\": 0.8656716417910447,\n\
\ \"acc_norm_stderr\": 0.02411267824090081\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3818849449204406,\n\
\ \"mc1_stderr\": 0.017008101939163495,\n \"mc2\": 0.5454276049890074,\n\
\ \"mc2_stderr\": 0.015570490235725166\n }\n}\n```"
repo_url: https://huggingface.co/The-Face-Of-Goonery/Huginn-13b-v4.5
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|arc:challenge|25_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|arc:challenge|25_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hellaswag|10_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hellaswag|10_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T04:09:44.501834.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T06:45:23.824442.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T06:45:23.824442.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T04:09:44.501834.parquet'
- split: 2023_08_31T06_45_23.824442
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T06:45:23.824442.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T06:45:23.824442.parquet'
- config_name: results
data_files:
- split: 2023_08_30T04_09_44.501834
path:
- results_2023-08-30T04:09:44.501834.parquet
- split: 2023_08_31T06_45_23.824442
path:
- results_2023-08-31T06:45:23.824442.parquet
- split: latest
path:
- results_2023-08-31T06:45:23.824442.parquet
---
# Dataset Card for Evaluation run of TheBloke/WizardLM-70B-V1.0-GPTQ
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/WizardLM-70B-V1.0-GPTQ
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/WizardLM-70B-V1.0-GPTQ](https://huggingface.co/TheBloke/WizardLM-70B-V1.0-GPTQ) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__WizardLM-70B-V1.0-GPTQ",
"harness_truthfulqa_mc_0",
split="train")
```
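As the file listing above shows, each timestamped split name is simply the results file's timestamp with `-` and `:` replaced by `_`. A minimal sketch of that mapping (the helper name is my own, not part of any library), handy when selecting a specific run instead of `latest`:

```python
def timestamp_to_split(ts: str) -> str:
    """Convert a results-file timestamp into the corresponding split name.

    Pattern observed in this card's YAML: '-' and ':' become '_'.
    """
    return ts.replace("-", "_").replace(":", "_")

# The second run of this model:
print(timestamp_to_split("2023-08-31T06:45:23.824442"))
# 2023_08_31T06_45_23.824442
```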
## Latest results
These are the [latest results from run 2023-08-31T06:45:23.824442](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__WizardLM-70B-V1.0-GPTQ/blob/main/results_2023-08-31T06%3A45%3A23.824442.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.636465710040601,
"acc_stderr": 0.03280341903722105,
"acc_norm": 0.6402033295604549,
"acc_norm_stderr": 0.03278106024809485,
"mc1": 0.3818849449204406,
"mc1_stderr": 0.017008101939163495,
"mc2": 0.5454276049890074,
"mc2_stderr": 0.015570490235725166
},
"harness|arc:challenge|25": {
"acc": 0.6075085324232082,
"acc_stderr": 0.014269634635670722,
"acc_norm": 0.6382252559726962,
"acc_norm_stderr": 0.014041957945038076
},
"harness|hellaswag|10": {
"acc": 0.6486755626369249,
"acc_stderr": 0.004764084597176899,
"acc_norm": 0.838478390758813,
"acc_norm_stderr": 0.00367259272936363
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.02914690474779833,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.02914690474779833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.046550104113196177,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.046550104113196177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939392,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939392
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944413,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944413
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7516129032258064,
"acc_stderr": 0.02458002892148101,
"acc_norm": 0.7516129032258064,
"acc_norm_stderr": 0.02458002892148101
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03011768892950357,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03011768892950357
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586808,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586808
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644234,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644234
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6128205128205129,
"acc_stderr": 0.024697216930878937,
"acc_norm": 0.6128205128205129,
"acc_norm_stderr": 0.024697216930878937
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606647,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606647
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.03017680828897434,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.03017680828897434
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4370860927152318,
"acc_stderr": 0.04050035722230636,
"acc_norm": 0.4370860927152318,
"acc_norm_stderr": 0.04050035722230636
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092448,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092448
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.03407632093854052,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.03407632093854052
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8438818565400844,
"acc_stderr": 0.023627159460318667,
"acc_norm": 0.8438818565400844,
"acc_norm_stderr": 0.023627159460318667
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7174887892376681,
"acc_stderr": 0.030216831011508773,
"acc_norm": 0.7174887892376681,
"acc_norm_stderr": 0.030216831011508773
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097655,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097655
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179333,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.013778693778464085,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.013778693778464085
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.024946792225272314,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.024946792225272314
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35195530726256985,
"acc_stderr": 0.015972668523689074,
"acc_norm": 0.35195530726256985,
"acc_norm_stderr": 0.015972668523689074
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.026787453111906504,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.026787453111906504
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.02631185807185416,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.02631185807185416
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7067901234567902,
"acc_stderr": 0.02532988817190092,
"acc_norm": 0.7067901234567902,
"acc_norm_stderr": 0.02532988817190092
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5097783572359843,
"acc_stderr": 0.012767793787729338,
"acc_norm": 0.5097783572359843,
"acc_norm_stderr": 0.012767793787729338
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5808823529411765,
"acc_stderr": 0.02997280717046462,
"acc_norm": 0.5808823529411765,
"acc_norm_stderr": 0.02997280717046462
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083376,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083376
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.763265306122449,
"acc_stderr": 0.02721283588407315,
"acc_norm": 0.763265306122449,
"acc_norm_stderr": 0.02721283588407315
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8656716417910447,
"acc_stderr": 0.02411267824090081,
"acc_norm": 0.8656716417910447,
"acc_norm_stderr": 0.02411267824090081
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3818849449204406,
"mc1_stderr": 0.017008101939163495,
"mc2": 0.5454276049890074,
"mc2_stderr": 0.015570490235725166
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
EmpathyFirstMedia/algolia | 2023-08-30T04:10:46.000Z | [
"region:us"
] | EmpathyFirstMedia | null | null | null | 0 | 0 | Entry not found |
qazisaad/llama_2_product_titles-esci_train-temp-pos | 2023-08-30T09:08:39.000Z | [
"region:us"
] | qazisaad | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: index
dtype: int64
- name: query
dtype: string
- name: average_score
dtype: float64
- name: total_score
dtype: float64
- name: text
dtype: string
- name: preds
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 3631696
num_examples: 1560
download_size: 525794
dataset_size: 3631696
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama_2_product_titles-esci_train-temp-pos"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
qazisaad/llama_2_product_titles-esci_train-temp-neg | 2023-08-30T11:11:04.000Z | [
"region:us"
] | qazisaad | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: index
dtype: int64
- name: query
dtype: string
- name: average_score
dtype: float64
- name: total_score
dtype: float64
- name: text
dtype: string
- name: preds
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1894756
num_examples: 960
download_size: 289567
dataset_size: 1894756
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama_2_product_titles-esci_train-temp-neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
huynhloc04/start3D_DB | 2023-08-30T04:51:19.000Z | [
"region:us"
] | huynhloc04 | null | null | null | 0 | 0 | |
linhqyy/result_with_w2v2_baseline | 2023-08-30T04:31:06.000Z | [
"region:us"
] | linhqyy | null | null | null | 0 | 0 | Entry not found |
Hardy011295/Hardy2 | 2023-08-30T04:34:04.000Z | [
"region:us"
] | Hardy011295 | null | null | null | 0 | 0 | Entry not found |
qazisaad/llama_2_optimized_product_titles-esci-part1 | 2023-08-30T04:44:13.000Z | [
"region:us"
] | qazisaad | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: level_0
dtype: int64
- name: index
dtype: int64
- name: product_title
dtype: string
- name: average_score
dtype: float64
- name: total_score
dtype: float64
- name: text
dtype: string
- name: preds
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 5347864
num_examples: 1680
download_size: 1028985
dataset_size: 5347864
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama_2_optimized_product_titles-esci-part1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dweebishqys/hotels | 2023-08-30T04:56:36.000Z | [
"region:us"
] | dweebishqys | null | null | null | 0 | 0 | Entry not found |
dane85/repo_name | 2023-08-30T05:01:52.000Z | [
"region:us"
] | dane85 | null | null | null | 0 | 0 | Entry not found |
Samee-ur/tinystories_instruction_finetuning | 2023-08-30T05:02:26.000Z | [
"region:us"
] | Samee-ur | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 16192
num_examples: 114
download_size: 9330
dataset_size: 16192
---
# Dataset Card for "tinystories_instruction_finetuning"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
linhqyy/result_with_w2v2_originspknorm | 2023-08-30T05:05:17.000Z | [
"region:us"
] | linhqyy | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: id
dtype: string
- name: w2v2_baseline_transcription
dtype: string
- name: w2v2_baseline_norm
dtype: string
splits:
- name: train
num_bytes: 174371835.027
num_examples: 1299
download_size: 164200997
dataset_size: 174371835.027
---
# Dataset Card for "result_with_w2v2_originspknorm"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Vierza/Dido_MajRealv2.5_v1 | 2023-08-30T05:08:38.000Z | [
"region:us"
] | Vierza | null | null | null | 0 | 0 | Entry not found |
eryttrytr/retret3545 | 2023-08-30T05:18:29.000Z | [
"region:us"
] | eryttrytr | null | null | null | 0 | 0 | Entry not found |
abdiharyadi/id_panl_bppt_with_amrbart_opus_mt_indobert_id_amr | 2023-08-30T05:35:53.000Z | [
"region:us"
] | abdiharyadi | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: translation
dtype:
translation:
languages:
- en
- id
- name: topic
dtype:
class_label:
names:
'0': Economy
'1': International
'2': Science
'3': Sport
- name: en_amr
dtype: string
- name: id_amr
dtype: string
splits:
- name: train
num_bytes: 583140
num_examples: 1220
download_size: 247241
dataset_size: 583140
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "id_panl_bppt_with_amrbart_opus_mt_indobert_id_amr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shyam-incedoinc/java_cucumber_code_train | 2023-08-30T05:42:47.000Z | [
"region:us"
] | shyam-incedoinc | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: step_file_name
dtype: string
- name: scenario
dtype: string
- name: step_function
dtype: string
- name: page_function
dtype: string
- name: step_function_scenario
dtype: string
- name: code
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 657798
num_examples: 248
download_size: 142096
dataset_size: 657798
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "java_cucumber_code_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Ben141/demo1 | 2023-08-30T16:51:21.000Z | [
"region:us"
] | Ben141 | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_yeontaek__llama-2-13B-ensemble-v6 | 2023-08-30T05:53:26.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of yeontaek/llama-2-13B-ensemble-v6
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yeontaek/llama-2-13B-ensemble-v6](https://huggingface.co/yeontaek/llama-2-13B-ensemble-v6)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__llama-2-13B-ensemble-v6\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-30T05:52:04.564811](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-13B-ensemble-v6/blob/main/results_2023-08-30T05%3A52%3A04.564811.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5732546102102893,\n\
\ \"acc_stderr\": 0.034192375404008664,\n \"acc_norm\": 0.5769517967834359,\n\
\ \"acc_norm_stderr\": 0.034176064530211395,\n \"mc1\": 0.3402692778457772,\n\
\ \"mc1_stderr\": 0.01658630490176256,\n \"mc2\": 0.5264024071528917,\n\
\ \"mc2_stderr\": 0.016382172245984476\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5034129692832765,\n \"acc_stderr\": 0.014611050403244081,\n\
\ \"acc_norm\": 0.5221843003412969,\n \"acc_norm_stderr\": 0.014597001927076136\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6101374228241386,\n\
\ \"acc_stderr\": 0.004867221634461273,\n \"acc_norm\": 0.8095000995817566,\n\
\ \"acc_norm_stderr\": 0.003918928556590479\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.5259259259259259,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.630188679245283,\n \"acc_stderr\": 0.029711421880107933,\n\
\ \"acc_norm\": 0.630188679245283,\n \"acc_norm_stderr\": 0.029711421880107933\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5202312138728323,\n\
\ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.5202312138728323,\n\
\ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929775,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929775\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4765957446808511,\n \"acc_stderr\": 0.03265019475033582,\n\
\ \"acc_norm\": 0.4765957446808511,\n \"acc_norm_stderr\": 0.03265019475033582\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3544973544973545,\n \"acc_stderr\": 0.024636830602842,\n \"acc_norm\"\
: 0.3544973544973545,\n \"acc_norm_stderr\": 0.024636830602842\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6838709677419355,\n\
\ \"acc_stderr\": 0.026450874489042767,\n \"acc_norm\": 0.6838709677419355,\n\
\ \"acc_norm_stderr\": 0.026450874489042767\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4187192118226601,\n \"acc_stderr\": 0.03471192860518468,\n\
\ \"acc_norm\": 0.4187192118226601,\n \"acc_norm_stderr\": 0.03471192860518468\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198906,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198906\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.02649905770139744,\n\
\ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.02649905770139744\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5846153846153846,\n \"acc_stderr\": 0.024985354923102325,\n\
\ \"acc_norm\": 0.5846153846153846,\n \"acc_norm_stderr\": 0.024985354923102325\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.02889774874113114,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.02889774874113114\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5546218487394958,\n \"acc_stderr\": 0.03228410626716391,\n \
\ \"acc_norm\": 0.5546218487394958,\n \"acc_norm_stderr\": 0.03228410626716391\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7743119266055046,\n \"acc_stderr\": 0.017923087667803067,\n \"\
acc_norm\": 0.7743119266055046,\n \"acc_norm_stderr\": 0.017923087667803067\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.033509916046960415,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.033509916046960415\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240658,\n \"\
acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240658\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n\
\ \"acc_stderr\": 0.03244305283008731,\n \"acc_norm\": 0.6278026905829597,\n\
\ \"acc_norm_stderr\": 0.03244305283008731\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n\
\ \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6942148760330579,\n \"acc_stderr\": 0.04205953933884122,\n \"\
acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.04205953933884122\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.03512385283705048,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.03512385283705048\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8162393162393162,\n\
\ \"acc_stderr\": 0.025372139671722926,\n \"acc_norm\": 0.8162393162393162,\n\
\ \"acc_norm_stderr\": 0.025372139671722926\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7675606641123882,\n\
\ \"acc_stderr\": 0.015104550008905718,\n \"acc_norm\": 0.7675606641123882,\n\
\ \"acc_norm_stderr\": 0.015104550008905718\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.025906632631016124,\n\
\ \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.025906632631016124\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38100558659217876,\n\
\ \"acc_stderr\": 0.01624202883405362,\n \"acc_norm\": 0.38100558659217876,\n\
\ \"acc_norm_stderr\": 0.01624202883405362\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6078431372549019,\n \"acc_stderr\": 0.027956046165424516,\n\
\ \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.027956046165424516\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6688102893890675,\n\
\ \"acc_stderr\": 0.02673062072800491,\n \"acc_norm\": 0.6688102893890675,\n\
\ \"acc_norm_stderr\": 0.02673062072800491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6820987654320988,\n \"acc_stderr\": 0.02591006352824088,\n\
\ \"acc_norm\": 0.6820987654320988,\n \"acc_norm_stderr\": 0.02591006352824088\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291488,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291488\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44132985658409385,\n\
\ \"acc_stderr\": 0.012682016335646673,\n \"acc_norm\": 0.44132985658409385,\n\
\ \"acc_norm_stderr\": 0.012682016335646673\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5441176470588235,\n \"acc_stderr\": 0.03025437257397671,\n\
\ \"acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.03025437257397671\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5816993464052288,\n \"acc_stderr\": 0.01995597514583555,\n \
\ \"acc_norm\": 0.5816993464052288,\n \"acc_norm_stderr\": 0.01995597514583555\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6408163265306123,\n \"acc_stderr\": 0.03071356045510849,\n\
\ \"acc_norm\": 0.6408163265306123,\n \"acc_norm_stderr\": 0.03071356045510849\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6965174129353234,\n\
\ \"acc_stderr\": 0.032510068164586174,\n \"acc_norm\": 0.6965174129353234,\n\
\ \"acc_norm_stderr\": 0.032510068164586174\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3402692778457772,\n\
\ \"mc1_stderr\": 0.01658630490176256,\n \"mc2\": 0.5264024071528917,\n\
\ \"mc2_stderr\": 0.016382172245984476\n }\n}\n```"
repo_url: https://huggingface.co/yeontaek/llama-2-13B-ensemble-v6
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|arc:challenge|25_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hellaswag|10_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T05:52:04.564811.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T05:52:04.564811.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T05:52:04.564811.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T05:52:04.564811.parquet'
- config_name: results
data_files:
- split: 2023_08_30T05_52_04.564811
path:
- results_2023-08-30T05:52:04.564811.parquet
- split: latest
path:
- results_2023-08-30T05:52:04.564811.parquet
---
# Dataset Card for Evaluation run of yeontaek/llama-2-13B-ensemble-v6
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yeontaek/llama-2-13B-ensemble-v6
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yeontaek/llama-2-13B-ensemble-v6](https://huggingface.co/yeontaek/llama-2-13B-ensemble-v6) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__llama-2-13B-ensemble-v6",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-30T05:52:04.564811](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-13B-ensemble-v6/blob/main/results_2023-08-30T05%3A52%3A04.564811.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; each task's results can be found in its corresponding configuration, under the "latest" split):
```python
{
"all": {
"acc": 0.5732546102102893,
"acc_stderr": 0.034192375404008664,
"acc_norm": 0.5769517967834359,
"acc_norm_stderr": 0.034176064530211395,
"mc1": 0.3402692778457772,
"mc1_stderr": 0.01658630490176256,
"mc2": 0.5264024071528917,
"mc2_stderr": 0.016382172245984476
},
"harness|arc:challenge|25": {
"acc": 0.5034129692832765,
"acc_stderr": 0.014611050403244081,
"acc_norm": 0.5221843003412969,
"acc_norm_stderr": 0.014597001927076136
},
"harness|hellaswag|10": {
"acc": 0.6101374228241386,
"acc_stderr": 0.004867221634461273,
"acc_norm": 0.8095000995817566,
"acc_norm_stderr": 0.003918928556590479
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5259259259259259,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.5259259259259259,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.630188679245283,
"acc_stderr": 0.029711421880107933,
"acc_norm": 0.630188679245283,
"acc_norm_stderr": 0.029711421880107933
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929775,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929775
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4765957446808511,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.4765957446808511,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3544973544973545,
"acc_stderr": 0.024636830602842,
"acc_norm": 0.3544973544973545,
"acc_norm_stderr": 0.024636830602842
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6838709677419355,
"acc_stderr": 0.026450874489042767,
"acc_norm": 0.6838709677419355,
"acc_norm_stderr": 0.026450874489042767
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4187192118226601,
"acc_stderr": 0.03471192860518468,
"acc_norm": 0.4187192118226601,
"acc_norm_stderr": 0.03471192860518468
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198906,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198906
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.02649905770139744,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.02649905770139744
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5846153846153846,
"acc_stderr": 0.024985354923102325,
"acc_norm": 0.5846153846153846,
"acc_norm_stderr": 0.024985354923102325
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.02889774874113114,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.02889774874113114
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5546218487394958,
"acc_stderr": 0.03228410626716391,
"acc_norm": 0.5546218487394958,
"acc_norm_stderr": 0.03228410626716391
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7743119266055046,
"acc_stderr": 0.017923087667803067,
"acc_norm": 0.7743119266055046,
"acc_norm_stderr": 0.017923087667803067
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.033509916046960415,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.033509916046960415
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240658,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240658
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6278026905829597,
"acc_stderr": 0.03244305283008731,
"acc_norm": 0.6278026905829597,
"acc_norm_stderr": 0.03244305283008731
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6942148760330579,
"acc_stderr": 0.04205953933884122,
"acc_norm": 0.6942148760330579,
"acc_norm_stderr": 0.04205953933884122
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.03512385283705048,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.03512385283705048
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8162393162393162,
"acc_stderr": 0.025372139671722926,
"acc_norm": 0.8162393162393162,
"acc_norm_stderr": 0.025372139671722926
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7675606641123882,
"acc_stderr": 0.015104550008905718,
"acc_norm": 0.7675606641123882,
"acc_norm_stderr": 0.015104550008905718
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.025906632631016124,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.025906632631016124
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38100558659217876,
"acc_stderr": 0.01624202883405362,
"acc_norm": 0.38100558659217876,
"acc_norm_stderr": 0.01624202883405362
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.027956046165424516,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.027956046165424516
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6688102893890675,
"acc_stderr": 0.02673062072800491,
"acc_norm": 0.6688102893890675,
"acc_norm_stderr": 0.02673062072800491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6820987654320988,
"acc_stderr": 0.02591006352824088,
"acc_norm": 0.6820987654320988,
"acc_norm_stderr": 0.02591006352824088
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291488,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291488
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44132985658409385,
"acc_stderr": 0.012682016335646673,
"acc_norm": 0.44132985658409385,
"acc_norm_stderr": 0.012682016335646673
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5441176470588235,
"acc_stderr": 0.03025437257397671,
"acc_norm": 0.5441176470588235,
"acc_norm_stderr": 0.03025437257397671
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5816993464052288,
"acc_stderr": 0.01995597514583555,
"acc_norm": 0.5816993464052288,
"acc_norm_stderr": 0.01995597514583555
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6408163265306123,
"acc_stderr": 0.03071356045510849,
"acc_norm": 0.6408163265306123,
"acc_norm_stderr": 0.03071356045510849
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6965174129353234,
"acc_stderr": 0.032510068164586174,
"acc_norm": 0.6965174129353234,
"acc_norm_stderr": 0.032510068164586174
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3402692778457772,
"mc1_stderr": 0.01658630490176256,
"mc2": 0.5264024071528917,
"mc2_stderr": 0.016382172245984476
}
}
```
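As an illustrative sketch (not part of the evaluation pipeline), the aggregated scores above are plain nested dicts, so they can be post-processed directly in Python, e.g. to average the per-subtask MMLU ("hendrycksTest") accuracies. Only a small subset of keys is shown here for brevity:

```python
# Illustrative subset of the results dict shown above; the real dict
# contains one "harness|hendrycksTest-...|5" entry per MMLU subtask.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.31},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5259259259259259},
    "harness|truthfulqa:mc|0": {"mc1": 0.3402692778457772},
}

# Keep only the MMLU subtasks and average their accuracies.
mmlu_accs = [
    scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
]
mean_mmlu_acc = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU subtasks: {len(mmlu_accs)}, mean acc: {mean_mmlu_acc:.4f}")
```

The same pattern applies to `acc_norm` or the TruthfulQA `mc1`/`mc2` fields, filtering on the corresponding task-name prefixes.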
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Danilla/banxmilna | 2023-08-30T06:40:45.000Z | [
"region:us"
] | Danilla | null | null | null | 0 | 0 | Entry not found |
Joshua8966/blog-writer_training-data-v30-8-2023 | 2023-08-30T06:43:19.000Z | [
"region:us"
] | Joshua8966 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: title
dtype: string
- name: article
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 72881118
num_examples: 12174
download_size: 46279297
dataset_size: 72881118
---
# Dataset Card for "blog-writer_training-data-v30-8-2023"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
eduagarcia-temp/mc4_dedup | 2023-08-31T17:47:38.000Z | [
"region:us"
] | eduagarcia-temp | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: timestamp
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 488218826601
num_examples: 161689320
download_size: 52220169137
dataset_size: 488218826601
---
# Dataset Card for "mc4_dedup"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
linhqyy/result_with_w2v2_spkn_ft_2e | 2023-08-30T06:18:33.000Z | [
"region:us"
] | linhqyy | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: id
dtype: string
- name: w2v2_baseline_transcription
dtype: string
- name: w2v2_baseline_norm
dtype: string
splits:
- name: train
num_bytes: 174371742.027
num_examples: 1299
download_size: 164200565
dataset_size: 174371742.027
---
# Dataset Card for "result_with_w2v2_spkn_ft_2e"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
qazisaad/llama_2_optimized_product_titles-esci-part2 | 2023-08-30T06:18:51.000Z | [
"region:us"
] | qazisaad | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: level_0
dtype: int64
- name: index
dtype: int64
- name: product_title
dtype: string
- name: average_score
dtype: float64
- name: total_score
dtype: float64
- name: text
dtype: string
- name: preds
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1526227
num_examples: 480
download_size: 300628
dataset_size: 1526227
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama_2_optimized_product_titles-esci-part2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yeegnauh/bert_wikipedia | 2023-09-11T09:39:45.000Z | [
"license:apache-2.0",
"region:us"
] | yeegnauh | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
BrunoGR/Emo_support_simplified | 2023-08-30T06:56:57.000Z | [
"region:us"
] | BrunoGR | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: texto
dtype: string
- name: etiqueta
dtype: string
splits:
- name: train
num_bytes: 7654295
num_examples: 68791
- name: test
num_bytes: 3224404
num_examples: 27445
- name: validation
num_bytes: 253200
num_examples: 2200
download_size: 4825871
dataset_size: 11131899
---
# Dataset Card for "Emo_support_simplified"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
talentlabs/training-data-blog-writer_v30-08-2023 | 2023-08-30T06:44:59.000Z | [
"region:us"
] | talentlabs | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: title
dtype: string
- name: article
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 72881118
num_examples: 12174
download_size: 46279297
dataset_size: 72881118
---
# Dataset Card for "training-data-blog-writer_v30-08-2023"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bp120reviews/bp120premium | 2023-08-30T06:40:53.000Z | [
"region:us"
] | bp120reviews | null | null | null | 0 | 0 | <h1><span style="background-color: maroon; color: white;"><a style="background-color: maroon; color: white;" href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure">➤➤BP120 Premium Blood Pressure Support – Official Website Link – Click Here</a></span></h1>
<p><strong>➤➤ Product Name - <a href="https://cinnachroma-reviews.blogspot.com/2023/08/bp120-premium-blood-pressure-support.html">BP120 Premium Blood Pressure Support</a><br /></strong></p>
<p><strong>➤➤ Quantity Per Bottle - 60 Capsules/Jar<br /></strong></p>
<p><strong>➤➤ Category - Blood Pressure Support<br /></strong></p>
<p><strong>➤➤ Composition - Natural Components Only</strong></p>
<p><strong>➤➤ Results - In 14 Days</strong></p>
<p><strong>➤➤ Availability – Official Website <span style="background-color: maroon; color: white;"><a style="background-color: maroon; color: white;" href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure">www.BP120.com</a></span></strong></p>
<p><strong>➤➤ Rating: - 4.8/5.0 ★★★★☆</strong></p>
<h3><span style="background-color: maroon; color: white;"><a style="background-color: maroon; color: white;" href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure">✅<strong>Click Here To Visit – OFFICIAL WEBSITE</strong>✅</a></span></h3>
<p><strong>Reviews on <a href="https://bp120-reviews.blogspot.com/2023/08/bp120-premium-blood-pressure-support.html">BP120 Premium Blood Pressure Support</a> :</strong> High blood pressure causes damage to arteries, which heightens the risk of cardiovascular disease, stroke, and other severe health problems. Sodium is an essential mineral that regulates fluid balance in the body, but high intake could raise blood pressure. Moreover, inflammation can damage arteries, and high LDL cholesterol levels can lead to arterial plaque. However, maintaining a healthy body balance of high-blood pressure risk factors can minimize the risk and support cardiovascular health. </p>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiebRlwHDsAsbOQZZAmSCS3-2cHegG-SzMB6KzZTP4DMIGUu8zhZmYNFyvktXLEGIQHg1HlAhpRcNP_cn2h7QM4p7e7ZJ70q-aNVNeUz3ESoaA7iroRAXoyY5nLVnbsdEvUwxtnfuBbI_jO4ip9QXcef-TKnm4CkbX15_D4vjaPIdEvMyV3380Pm6dyAJhh/w640-h500/BP120%20Premium%20Blood%20Pressure%20Support%2011.png" alt="" width="640" height="500" border="0" data-original-height="442" data-original-width="565" /></a></div>
<p><a href="https://sites.google.com/view/bp120-/home">BP120 Premium Blood Pressure Support</a> has a proprietary blend of ingredients proven to reduce inflammation and enhance fluid balance, thus regulating blood pressure. According to the creator, the supplement enhances blood vessel dilation, increasing blood flow. Moreover, it eases water retention, protects the kidney, and supports cardiovascular health. What does the proprietary blend contain? This review has more information about the <a href="https://colab.research.google.com/drive/1ZMgiJpxh_rK2WjukdZ_qfJEHSicEN9-C">BP120 Premium Blood Pressure Support</a>.</p>
<h2 style="text-align: center;"><span style="background-color: maroon; color: white;"><a style="background-color: maroon; color: white;" href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure"><strong>➦➦Visit The Official Website To Get BP120 Premium Blood Pressure Support Now!</strong></a></span></h2>
<h2><strong>What is BP120 Premium Blood Pressure Support?</strong></h2>
<p><a href="https://lookerstudio.google.com/reporting/12d91229-5962-41d9-9319-308861175453">BP120 Premium Blood Pressure Support</a> is an all-natural and effective advanced blood pressure support supplement to reactivate the Blood Pressure Release Valve in your heart. This ground-breaking formula uses natural ingredients, which help synthesize plant nutrients, starting with the Crocus flower extract. Each ingredient in this exceptional blend works in perfect harmony to arrange a symphony of advantages. The main objective is to reactivate the kidney-based Blood Pressure Release Valve. This mixture restores equilibrium while it works its magic, enabling your body to reestablish control over blood pressure.</p>
<h2 style="text-align: center;"><span style="background-color: white; font-weight: normal;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgDEGiNrj8kdn9PBs3IP31J8UOIiKXulu5VGsMNmW2kf024AQGLHcneA0wRLrcRpiJmAXoSJVyKEFhK49o3QfMD9NfxgPS2QoTNUYvvYc5PrA-LXg_RWxebiKuWw9cPmqC61Ev16kYLwiZGQFgT6PqK8zk9HrYWmxJExelfNZYeWH_Gtp0xz3EuyJi2MrNT/w640-h310/BP120%20Premium%20Blood%20Pressure%20Support%207.jpg" alt="" width="640" height="310" border="0" data-original-height="598" data-original-width="1231" /></a></span></h2>
<p><a href="https://groups.google.com/g/bp120-premium-blood-pressure-support-official/c/LhgmFiPQc0U">BP120 Premium Blood Pressure Support</a> mainly helps to provide you with a path to optimal health where the forces of nature work together to promote your well-being. The supplement uses a special blend that helps the body eliminate excess fluid and salt and relaxes blood vessel walls to allow for unhindered blood flow. Read this complete review to know more about the <a href="https://bp120-updates.clubeo.com/calendar/2023/08/29/bp120-premium-supports-helps-to-improve-vascular-health-overall-blood-pressure-in-all-veins-arteries">BP120 Premium Blood Pressure Support</a> and to understand the complete benefits of the <a href="https://bp120-updates.clubeo.com/page/bp120-premium-blood-pressure-support-official-reviews-2023-works-starts-in-14-days-reduce-tiredness.html">BP120 Premium Blood Pressure Support</a>.</p>
<h2><strong><span style="color: red;">BP120 Premium Blood Pressure Support</span> - The Way Its Works For You?</strong></h2>
<p><a href="https://bp120-updates.clubeo.com/page/bp120-premium-clinically-proven-formulated-with-100-pure-ingredients-thats-controls-high-blood-pressure.html">BP120 Premium Blood Pressure Support</a> has been meticulously developed to solve this issue, since it recognizes inflammation's crucial role in health. It strives to mitigate the negative effects of harmful chemicals that could jeopardize the health of your arteries. By doing this, it adopts a multifaceted strategy to protect against cardiovascular problems, promote a healthy blood flow, and instill calm inside the circulatory system. The formula for <a href="https://bp120-updates.clubeo.com">BP120 Premium Blood Pressure Support</a> is a blend of high-quality ingredients, each selected for its distinct qualities. This complex mixture uses its strong cleansing qualities to help your arteries thoroughly detoxify by removing hazardous toxin buildups that may impede blood flow.</p>
<h2 style="text-align: center;"><span style="background-color: white; font-weight: normal;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEglvRWlIEo8pC_lFbgApNjhztdsAkTbUBz4hEm0bROtMN4woHk6__ARERqhvsWkyX1DtBTRK4W_HDpORAK3eZZ8RtywETTMATJKLUKb68IO4qgJ2P8Um9fxsyZKon2ShTXRnwkXj_bkdaWg3k3OheNzUeNR3iPeyLAfXNvslljT7mXadtz0uUc_HOHysTeh/w640-h318/BP120%20Premium%20Blood%20Pressure%20Support%202.jpg" alt="" width="640" height="318" border="0" data-original-height="733" data-original-width="1475" /></a></span></h2>
<p>These toxin accumulations may block arteries, causing obstructions that endanger general health. Increased levels of the TSC protein prevent more water molecules from flowing and instead cause them to become stuck in blood vessels. As a result, the fluid level in the same area rises, which causes high blood pressure. To lower the TSC proteins, it is important to REACTIVATE the Blood Pressure Release Valve. This happens due to the accumulation of fat and cholesterol in veins, which raises blood vessel pressure.</p>
<h2 style="text-align: center;"><span style="background-color: maroon; color: white;"><a style="background-color: maroon; color: white;" href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure"><strong>➦➦Get your BP120 Premium Blood Pressure Support Here & Get Great Discount!</strong></a></span></h2>
<h2><strong>Natural Components in BP120 Premium Blood Pressure Support.<br /></strong></h2>
<ul>
<li><strong>BEET ROOT POWDER:</strong> Beets are one of only a few plant foods that contain dietary nitrates. Nitrates convert into nitric oxide, a molecule that dilates blood vessels to help increase blood flow, which helps pretty much every function in the body.</li>
<li><strong>AGED BLACK GARLIC EXTRACT:</strong> Black garlic has shown promise in improving cardiovascular health by reducing arterial stiffness, elevated cholesterol levels and blood 'stickiness'.</li>
<li><strong>HAWTHORN BERRY POWDER:</strong> Hawthorn is used to protect against heart disease and help control high blood pressure and high cholesterol. Both animal and human studies suggest hawthorn increases coronary artery blood flow, improves circulation, and lowers blood pressure.</li>
</ul>
<h2 style="text-align: center;"><span style="background-color: white; font-weight: normal;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhi7TyyvgtIzcgpg-JYJ4-8zoMxksp5sGpxH7Ce40Ki4uz8TO2z2WCiB6dbzEWWO9Pc9TN8Gw6_PYmPzjINzEkXulXPgU7hZZU2Nb2647YPRaNOEtPuU23by2tvt9ypnjOiYLj5Wsr4VaGxNtu2NW3eriaTWAM20XnCa3HHjoPx5MgMAJS08CtEXVIe-udN/w640-h254/BP120%20Premium%20Blood%20Pressure%20Support%208.jpg" alt="" width="640" height="254" border="0" data-original-height="714" data-original-width="1800" /></a></span></h2>
<ul>
<li><strong>CoQ10:</strong> CoQ10 has been shown to help alleviate heart stress, especially for those already on prescribed medication or who have had any heart procedures done. This is a basic but essential binding ingredient for our life-saving formulation.</li>
<li><strong>Beta Cyclodextrin:</strong> Its sole job is to make this powerhouse formulation more bioavailable, which means it'll work effectively and quickly to help you stabilize your blood pressure on a daily basis.</li>
<li><strong>L-Theanine:</strong> A naturally occurring, non-protein amino acid that has been shown to help reduce blood pressure and limit stress. It is also beneficial for a good night's sleep, which can help promote overall relaxation.</li>
</ul>
<h2><strong>The key benefits of BP120 Premium Blood Pressure Support.</strong></h2>
<p style="text-align: left;">BP120 Premium Blood Pressure Support helps to improve vascular health and overall blood pressure. It supports appropriate systolic and diastolic pressure control and can enhance platelet production and flow for effective clotting. The supplement can improve blood vessel elasticity and strength, provide artery cells with a beneficial anti-inflammatory response, and increase nitric oxide generation to widen blood vessels and reduce tiredness. It combats oxidative stress and its negative impact on cells, reduces cortisol and the stress response, and increases bodily energy. It can also enhance kidney performance, lower cholesterol, and boost the immune system to help prevent illness and disease. </p>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj3JhdveJXXanl80eSdytPDq0bK_ngO1jhCL3EUXXcbaLjrt_fU5_W_CDDjbf0p2zKwvN9ScZk_tecmQHznIMgz0DQtCXyDbIM-jYiL_iZaxfzDI64lpJo4bxQIYV0iMl9HcHHABP4F09NXWpvQsrJLtuUcK2CO7iJI7Vd_65nBYL8hdihsvoJ9-1wHf0Nz/w640-h260/BP120%20Premium%20Blood%20Pressure%20Support%205.jpg" alt="" width="640" height="260" border="0" data-original-height="764" data-original-width="1881" /></a></div>
<h2><strong>Pros of BP120 Premium Blood Pressure Support :</strong></h2>
<ul>
<li><strong>Lowers Blood Pressure.</strong></li>
<li><strong>Boost Your Overall Heart Health.</strong></li>
<li><strong>Powerful Antioxidant Function.</strong></li>
<li><strong>Supports Heart Rate.</strong></li>
<li><strong>Supports Cardiovascular Health.</strong></li>
<li><strong>Increases Metabolic Function.<br /></strong></li>
<li><strong>100% Natural, Certified Non-GMO.</strong></li>
<li><strong>Rigorous Quality Control Process.</strong></li>
</ul>
<h2 style="text-align: center;"><span style="background-color: maroon; color: white;"><a style="background-color: maroon; color: white;" href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure"><strong>➦➦Order Here Your BP120 Premium Blood Pressure Support & Grab The Big Discount Right Now!</strong></a></span></h2>
<h2><strong>Cons of BP120 Premium Blood Pressure Support:</strong></h2>
<ul>
<li>Buy this BP120 Premium Blood Pressure Support only from the <strong>official website</strong>, not any other platform.</li>
<li>This supplement is not for lactating or pregnant women.</li>
</ul>
<h2><strong>How Safe Is BP120 Premium Blood Pressure Support?</strong></h2>
<p>BP120 Premium Blood Pressure Support is made with carefully chosen ingredients combined in a facility that upholds the highest standards of Good Manufacturing Practices. This facility, which regularly undergoes FDA audits, proves our commitment to your safety. Regular testing for purity and the absence of any potentially dangerous substances, including bacteria, pathogens, pesticides, and preservatives, is an essential component of our procedure. After these meticulous inspections, the entire batch is rejected if any problems are found, demonstrating our unshakable dedication to your well-being. The supplement is therefore considered safe and effective, playing a vital role in delivering a wide range of health benefits.</p>
<h2><strong>BP120 Premium Blood Pressure Support Usage.</strong></h2>
<h2 style="text-align: center;"><span style="background-color: white; font-weight: normal;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi6GdIMiotl5sE8xD9-DhjmfOEQqtZ6N-m7krrUDd9cM2qixpPkLOQKXF3jdHTfK4DR_W0Fjc8y-ohgUwv1khjV8Yku8L9zviFgOV2FUQGTyBry8X6ngXrgOfKkozY8ZRpxhh-VDtwL-4hrIwyPF07cDu-UnCDMiT7_xVHFJXwC3gMKlWCWl8CQtrsHHh7E/w640-h236/BP120%20Premium%20Blood%20Pressure%20Support%201.jpg" alt="" width="640" height="236" border="0" data-original-height="1002" data-original-width="2712" /></a></span></h2>
<p>Each BP120 Premium Blood Pressure Support bottle contains 60 capsules. The manufacturer recommends consumers take two pills daily. According to the creator, vegan capsules have been tested for purity and safety and have no known side effects. Daily supplement usage can enable consumers to maintain healthy blood pressure and support cardiovascular health. However, consumers under medication or with underlying health problems should consult their physicians before using the dietary supplement.</p>
<h2><strong>Detailed pricing of BP120 Premium Blood Pressure Support.</strong></h2>
<h2 style="text-align: center;"><span style="background-color: white; font-weight: normal;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg4k_-nwmNjqRTj6qk3mHPdu38E-mdTmI9qPyWSrCOCb07petlcUAHdoJzmsrCs6GJMdNXBfgPrywHxuI-okHPcJFUUHlPlC0DE6n-Q0X5nHCa0rFLhuueL0-CEaWiv87yH6nYwj6UGnik9sKelKAaPq1zj8aai7hxsoFyc-m0HnUK9s9muPLdj03RWpy7b/w640-h358/Screenshot%20(536).png" alt="" width="640" height="358" border="0" data-original-height="651" data-original-width="1161" /></a></span></h2>
<ul>
<li style="text-align: left;"><strong>Starter Pack ➣➣ Buy One Bottle of BP120 Premium Blood Pressure Support $59 + Free US Shipping.</strong></li>
<li style="text-align: left;"><strong>Most Popular Pack ➣➣ Buy Three Bottles of BP120 Premium Blood Pressure Support $147 [USD 49/bottle]+ Free US Shipping.</strong></li>
<li style="text-align: left;"><strong><span style="background-color: #339966; color: white;">Best Value Pack ➣➣ Buy Six Bottles of BP120 Premium Blood Pressure Support $234 [USD 39/bottle]+ Free US Shipping.</span> <span style="color: green;">✔✔</span><br /></strong></li>
</ul>
<h2 style="text-align: center;"><span style="background-color: maroon; color: white;"><a style="background-color: maroon; color: white;" href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure"><strong>➦➦Just Click Here To Visit the Official Website & Buy BP120 Premium Blood Pressure Support!</strong></a></span></h2>
<h2>Money back guarantee on BP120 Premium Blood Pressure Support.</h2>
<p>A 90-day money-back guarantee backs each purchase of BP120 Premium Blood Pressure Support. Consumers who find the product unsatisfactory can return the BP120 Premium Blood Pressure Support and get a full refund. However, they’ll have to cover the shipping and handling fee. For more information <a href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure"><span style="background-color: yellow; color: red;"><strong><<<<CLICK HERE>>>></strong></span></a></p>
<h2><strong>FAQ on BP120 Premium Blood Pressure Support</strong></h2>
<p><strong>Q. What Is The Full List of Ingredients in BP120 Premium Blood Pressure Support? And Are There Any Allergens?</strong></p>
<p><strong>Ans.</strong> BP120 Premium Blood Pressure Support contains clinical doses of the following:</p>
<ul>
<li><strong>CoQ10</strong></li>
<li><strong>L-Theanine</strong></li>
<li><strong>BEET ROOT POWDER</strong></li>
<li><strong>AGED BLACK GARLIC EXTRACT</strong></li>
<li><strong>HAWTHORN BERRY POWDER</strong></li>
<li><strong>Beta Cyclodextrin</strong></li>
</ul>
<p>Each ingredient is sourced from non-GMO plant extracts. BP120 Premium Blood Pressure Support is free of any allergens, including gluten, dairy, soy, shellfish, and tree nuts.</p>
<p><strong><span class="faq__question--pre">Q</span>.<span class="faq__question__text">How Long Does BP120 Premium Blood Pressure Support Take To See Results?</span></strong></p>
<p><strong><span class="faq__question__text">Ans. </span></strong>Most users see results within the first week. For some, it may take 14 days to see your blood pressure start to come down. The best results will be seen after 60 days because the longer you use BP120 Premium Blood Pressure Support, the better the results typically are!</p>
<p><strong><span class="faq__question__text">Q.Will BP120 Premium Blood Pressure Support Work For Me?</span></strong></p>
<p><span class="faq__question__text"><strong>Ans.</strong> Yes! BP120 Premium Blood Pressure Support is formulated with the highest quality ingredients, in the exact doses needed, that have been used for centuries and have been clinically proven.</span></p>
<p><strong><span class="faq__question__text">Q.How Does The Refund Policy Work?</span></strong></p>
<p><span class="faq__question__text"><strong>Ans.</strong> Your BP120 Premium Blood Pressure Support's Bottles comes with a 90 Days, 100% Money Back Guarantee. That means if you change your mind about this decision at any point in the next 3 months – all you need to do is email us or call our customer support team, they'll give you a return address where you can ship both your empty and full products and we’ll refund your purchase. The shipping and postage to cover the return will be paid by the customer.</span></p>
<h2><strong>BP120 Premium Blood Pressure Support Conclusion.</strong></h2>
<p>BP120 Premium Blood Pressure Support can be a remedy for individuals with mild to moderate hypertension. According to the creator, it has ingredients whose purity and potency are scientifically tested to support blood pressure effectively. The formula’s constituents have active compounds that combat oxidative stress, reduce inflammation, and have antihypertensive effects. Moreover, the formula inhibits oxidation of LDL cholesterol, promotes blood vessel relaxation, and improves blood flow. The dietary capsules enhance blood circulation, supporting heart function and kidney health. Consumers can choose their preferred pack on the official website and maintain healthy blood pressure levels.</p>
<h2 style="text-align: center;"><span style="background-color: maroon; color: white;"><a style="background-color: maroon; color: white;" href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure"><strong>➦➦Visit the Official Website Today and Grab Your Bottle!</strong></a></span></h2>
<p> </p> |
bponetwenty/BP120-Premium | 2023-08-30T06:46:16.000Z | [
"region:us"
] | bponetwenty | null | null | null | 0 | 0 | <h1 style="text-align: left;">BP120 Premium Blood Pressure</h1>
<p><span style="font-family: georgia;"><strong>Product Name - BP120 Premium<br /></strong></span></p>
<p><span style="font-family: georgia;"><strong>Side Effects - No Side Effects (100% Natural)</strong></span></p>
<p><span style="font-family: georgia;"><strong>Main Benefits - Support Healthy Blood Pressure & Sugar <br /></strong></span></p>
<p><span style="font-family: georgia;"><strong>Category - Blood Sugar Formula<br /></strong></span></p>
<p><span style="font-family: georgia;"><strong>Results - In 1-2 Months</strong></span></p>
<p><span style="font-family: georgia;"><strong>Availability - Online</strong></span></p>
<p><span style="font-family: georgia;"><strong>Customer Reviews - ★★★★✰ 4.9/5</strong></span></p>
<p><span style="font-family: georgia;"><strong>Price - Visit <a href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure">Official Website</a></strong></span></p>
<h3 style="text-align: center;"><a href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure"><span style="font-family: georgia;"><strong><span style="color: red;"><span style="background-color: #ffe599;">Get Huge Discount Now!!</span></span></strong></span></a></h3>
<h3 style="text-align: center;"><a href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure"><strong><span style="font-family: georgia;"><span style="background-color: #fff2cc;"><span style="color: red;">Special Discount- As Low As On BP120 Premium – Get Your Best Discount Online Hurry!!</span></span></span></strong></a></h3>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjHIkUMj-RZm9PgFNlzrsthLuvJIHVZzIKXwaZsVbQbDcNlI6yC8QTb1uNpAqfhzF0Pkjq_tfgt47aPDYZAaiw9jexuGZ-CXp0E6tIGmBHLwhU0-BE3OTzHryjRiQKwbpl9dybEt3fE64aaJnovSbh6h0WbnsaUOhZ1bZkGYV2ro_okH7ON_WhoTfpHZFrE/w640-h236/BP120%20Premium%20Blood%20Pressure%20Support%201.jpg" alt="" width="640" height="236" border="0" data-original-height="1002" data-original-width="2712" /></a></div>
<p>High blood pressure has become a common condition, with almost one in three adults in the US diagnosed with hypertension. The condition can lead to heart failure, stroke, kidney failure, loss of vision, and other issues.</p>
<p>High blood pressure occurs when the pressure of circulating blood pushing against the walls of the arteries rises. You can live for many years without knowing you have the disease, as it sometimes shows no symptoms.</p>
<p>Using all-natural supplements ensures that you do not face harmful side effects. <a href="https://www.podcasts.com/bp120-premium-blood-pressure">BP120 Premium</a> Blood Pressure dietary supplement provides eight vital bio-available fruit flower bark extracts that contain the necessary minerals and vitamins to support healthy blood pressure. The supplement offers fulfilling and permanent results.</p>
<p>Continue reading this review to learn whether BP120 Premium Blood Pressure works, its ingredients, benefits, pros, cons, and pricing.</p>
<h2 style="text-align: center;"><a href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure"><strong><span style="font-family: georgia;"><span style="background-color: #d9d2e9;"><span style="color: red;">SALE IS LIVE</span></span></span></strong></a></h2>
<h2 style="text-align: center;"><a href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure"><strong><span style="font-family: georgia;"><span style="background-color: #ffe599;">Get <span style="color: red;">BP120 Premium </span> “Now Available” Hurry Limited Time Offer Only For 1st User!!</span></span></strong></a></h2>
<h2 style="text-align: left;"><strong>What is <a href="https://www.eventcreate.com/e/bp120-premium-blood-pressure">BP120 Premium Blood Pressure</a>?</strong></h2>
<p><a href="https://bitbucket.org/bp120-premium/bp120-premium/issues/1/bp120-premium-blood-pressure-formula">BP120 Premium</a> Blood Pressure is a dietary supplement formulated to support healthy blood pressure. The supplement comprises pure and organic ingredients that will enable you to say goodbye to blood pressure problems for good.</p>
<p>Each ingredient in BP120 Premium Blood Pressure is backed with research that proves its effectiveness. The safe and pure ingredients ensure that users are free from any side effects. BP120 Premium Blood Pressure is formulated using the right concentration of ingredients for maximum results.</p>
<p>The supplement is in capsule form, which is easy to swallow. BP120 Premium Blood Pressure is vegan-friendly, GMO-free, and toxin-free. Customers who have used the product have reported significant improvement.</p>
<p>Besides lowering blood pressure, BP120 Premium Blood Pressure also prevents other blood pressure-related problems. The supplement relieves blood pressure symptoms such as fatigue, chest pain, and stress.</p>
<h3 style="text-align: center;"><span style="background-color: #d9ead3;"><span style="color: red;"><span style="font-family: georgia;"><strong><a href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure">LIMITED TIME OFFER</a></strong></span></span></span></h3>
<h3 style="text-align: center;"><span style="background-color: #ffe599;"><span style="color: red;"><span style="font-family: georgia;"><strong><a href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure">Click Here to Order BP120 Premium at Special Discounted Price</a></strong></span></span></span></h3>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjZVX3z63ppsP0j3fkOcY7XIQ8FBopfzh2UPOcAh0R0X1bmwGeBV8o5DD9-JdpVgVsojhwqN_EggLTsTBS-g0HDBj7Nu0uXo-YPw7LAvpv6ct8P9R_BR5rQRbFWAdlDS7HHRjhKaL0bbuMetcaujPv3cgwZQWm0QJAyeYQ0lpnagt392jT34o_h0LDu4ENC/w508-h381/BP120%20Premium%20Blood%20Pressure%20Support%2010.png" alt="" width="508" height="381" border="0" data-original-height="1050" data-original-width="1400" /></a></div>
<h2 style="text-align: left;"><strong>Does <a href="https://bp120-premium-blood-pressure.mystrikingly.com/">BP120 Premium Blood Pressure</a> Work?</strong></h2>
<p>When it comes to supporting healthy Blood Pressure, the BP120 Premium Blood Pressure supplement has gained popularity among diabetics. However, this remarkable supplement offers even more benefits to the body, thanks to its natural antioxidants.</p>
<p>Understanding what <a href="https://www.podcasts.com/bp120-premium-blood-pressure/episode/bp120-premium-blood-pressure-formula-maintaining-healthy-glucose-metabolism-energy-levelswork-or-hoax">BP120 Premium</a> Blood Pressure is and how it works is crucial in determining if it’s the right solution for managing your diabetes. The main target of BP120 Premium Blood Pressure is a specific molecule in the blood that can wreak havoc on your health.</p>
<p>This molecule, called ceramide, is known to contribute to the accumulation of fat in your blood vessels, particularly in the arteries. It can also lead to arterial stiffness, which is a concerning issue.</p>
<p>Additionally, ceramide triggers the storage of fat in various organs, such as the heart, kidneys, liver, and pancreas. These organs are particularly vulnerable in individuals with type 2 diabetes. It’s crucial to address this molecule to maintain optimal health and well-being.</p>
<p>The natural antioxidants present in this supplement work together to combat the harmful effects of ceramide, promoting a healthier cardiovascular system and overall well-being.</p>
<p>So, if you’re looking for a friendly and effective way to manage your Blood Pressure levels, consider giving BP120 Premium Blood Pressure a try. As per the BP120 Premium Blood Pressure customer reviews online, thousands of individuals have reaped its incredible benefits and taken control of their health.</p>
<h3 style="text-align: left;"><strong style="font-family: georgia;">MUST SEE: <span style="background-color: #ffe599; color: red;"><a href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure">“Critical News BP120 Premium Report – They Will Never Tell You This”</a></span></strong></h3>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjFvRMtV66XvaIPkSdnDXM5MZYq09-Q-e3ZFf92suam80D7qx-5NtzPIGa4QyNWlEES7aCStyXijuP0gu2aDpF1xP3q80GoM3q6ewFdpGhqeRWfkDvtJja8pp_0mf3xUOsG9vqyoIKuaZF0AD-RWD9_kik58f6W1ZnE9hYD91lqnY3nvRbz1d4UoFj0pD19/w640-h332/BP120%20Premium%20Blood%20Pressure%20Support%203.jpg" alt="" width="640" height="332" border="0" data-original-height="772" data-original-width="1485" /></a></div>
<h2 style="text-align: left;"><strong><a href="https://bp120-premium-blood-pressure-support-reviews.jimdosite.com/">BP120 Premium Blood Pressure</a> Ingredients:</strong></h2>
<p>Let’s take a closer look at the incredible ingredients that make up <a href="https://bp120-premium-report.clubeo.com/page/bp120-premium-blood-pressure-formula-maintaining-healthy-glucose-metabolism-energy-levels-work-or-hoax.html">BP120 Premium</a> Blood Pressure and how they contribute to supporting your Blood Pressure levels.</p>
<p><strong>Cayenne:</strong> Spice up your life with cayenne pepper, an ingredient that plays a significant role in BP120 Premium Blood Pressure. Cayenne contains a compound called capsaicin, known for its positive effects on individuals with type 2 diabetes.</p>
<p>In fact, studies have shown that consuming capsaicin led to a significant reduction in blood glucose levels in diabetic rats. The researchers attributed this effect to increased glycogen and insulin levels.</p>
<p><strong>Vitamin C:</strong> You probably already know the importance of vitamin C in supporting overall health, but did you know it can also help regulate Blood Pressure levels? Vitamin C in the BP120 Premium Blood Pressure formula serves as a potent and safe nutrient.</p>
<p>It not only aids in controlling Blood Pressure levels after meals but also offers additional benefits for your eyes, cardiovascular system, and immune function.</p>
<p><strong>L-Taurine:</strong> Say hello to L-Taurine, an amino acid that works wonders for your cardiovascular function, bile acids, calcium signaling, and antioxidation. It’s a multitasking ingredient that also plays a role in preventing tubulointerstitial injuries and reducing the risk of diabetes-related complications, such as microangiopathies.</p>
<p><strong>Banaba Leaf:</strong> Another fantastic ingredient found in the BP120 Premium Blood Pressure pills is the banaba leaf, native to India. The star of the show in the banaba leaf is corosolic acid, known for its antioxidant and antihyperlipidemic properties.</p>
<p>These properties can enhance glucose uptake and aid in the management of lipid metabolism. With its rapid lipid metabolization abilities, banaba leaf truly stands out as a valuable addition to BP120 Premium Blood Pressure.</p>
<p><strong>Bitter Melon: </strong>Imagine a vegetable with a bitter taste, similar to a cucumber but without the water content. That’s bitter melon! Recent research has shown that bitter melon, when ingested, provides energy to our cells and activates them through an enzyme called AMP-activated protein kinase (AMPK). This activation can enhance glucose tolerance and promote fat oxidation, making bitter melon a valuable addition to BP120 Premium Blood Pressure.</p>
<p><strong>Licorice: </strong>The roots of the licorice plant have long been recognized for their medicinal properties, and they are a valuable component in the formula. Traditional medicine considers licorice roots as an excellent alternative to sugar.</p>
<p>In a study involving rats, those who received a daily dose of 1g of licorice per 1kg of body weight for two months experienced a reversal of adverse diabetes effects.</p>
<p>Similarly, a human study showed that incorporating dried licorice extracts into a calorie-restricted diet led to improvements in various health markers, including fat mass, insulin resistance, and vaspin serum levels.</p>
<p><strong>Magnesium: </strong>Research has shown that many diabetic patients also experience symptoms of magnesium deficiency. That’s why BP120 Premium Blood Pressure includes magnesium as an active ingredient in its formula.</p>
<p>Magnesium plays a crucial role in regulating Blood Pressure levels and offers a range of additional benefits. From lowering blood pressure to boosting exercise performance, treating migraines, and reducing inflammation, magnesium is a true powerhouse for your well-being.</p>
<p><strong>Guggul: </strong>Originating from a tree in India, guggul has gained popularity for its healing properties found in its resin. A study specifically focused on guggul revealed its potential to help manage triglyceride and cholesterol levels, offering support for your overall cardiovascular health.</p>
<h3 style="text-align: left;"><a href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure"><span style="background-color: #ffe599;"><span style="color: red;"><span style="font-family: georgia;"><strong>To Learn More about BP120 Premium Ingredients in Detail, Click Here to Head to Its Official Website</strong></span></span></span></a></h3>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjM-qrUhxsZoYzMV30CF7ajhb0kz4UMKQisq2anywsYO5IvgWTh1-MGJDrmtRTlEuMqXrc0cBCowdBX2TWFJ0zymQX80abt5coKOF_tEBBIIQYJ_6ui7SsA2DImofeI0wc5ZKWcnap7C2vbMBuGOofp_zkxiFEY_y7gx4mSXhaJv-lKuAR2daw715bWqXxR/w640-h254/BP120%20Premium%20Blood%20Pressure%20Support%208.jpg" alt="" width="640" height="254" border="0" data-original-height="714" data-original-width="1800" /></a></div>
<h2 style="text-align: left;"><strong>Benefits of Using <a href="https://bp120-premium-blood-pressure-support.company.site/">BP120 Premium Blood Pressure</a><br /></strong></h2>
<p><strong>Support Healthy Blood Pressure Levels:</strong> If you’re struggling to keep your Blood Pressure within a normal range, <a href="https://bp120-premium.hashnode.dev/bp120-premium-blood-pressure-formula-maintaining-healthy-glucose-metabolism-energy-levelswork-or-hoax">BP120 Premium</a> Blood Pressure is here to help. Diabetics and others facing Blood Pressure control challenges can benefit from this supplement.</p>
<p>By incorporating BP120 Premium Blood Pressure into your routine, it aims to support your Blood Pressure goals and assist in maintaining healthy levels throughout the day.</p>
<p><strong>Improve Overall Blood Circulation:</strong> BP120 Premium Blood Pressure goes beyond Blood Pressure support and also works to improve your overall blood health. Packed with a blend of powerful antioxidants, it contributes to better blood circulation throughout your body. When your blood flows effortlessly, it reduces the strain on your heart and vital organs, promoting better cardiovascular health.</p>
<p><strong>Increases Vigor & Energy:</strong> Experience a boost in vitality and energy with BP120 Premium Blood Pressure. The formula is designed to enhance your overall energetic well-being, fight fatigue, and increase vitality levels.</p>
<p>Many of the natural antioxidants found in BP120 Premium Blood Pressure also have anti-inflammatory properties that can positively impact your energy, stamina, and overall health and wellness.</p>
<p><strong>Works for Everyone:</strong> BP120 Premium Blood Pressure has shown proven Blood Pressure-supporting effects across various age groups, from individuals in their 30s to those in their 70s.</p>
<p>The manufacturer proudly claims that the supplement is 100% natural, safe, and effective, with thousands of users reporting no side effects. Hence, you won’t find any BP120 Premium Blood Pressure complaints online.</p>
<h3 style="text-align: left;"><strong style="font-family: georgia;">Read This: <span style="background-color: #ffe599; color: red;"><a href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure">"More Information From Knowledgeable Expertise of Health Labs BP120 Premium"</a></span></strong></h3>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgac0jXuarBkKblc1gXkcKs8ehcYattgLpvbd9WhGGsRCir7zz0T9SksPmOKkU61Oe2KTHwyHDjqC212ui8tzz_ri-ZQm31ThgFvNPj5Ztpdc-bAZhs3O7Ief_pyuHmyi0WLdA56KiUncKzV-vQGm-xP_-Vl1a6sD4KNMgS-xOjuiU7oVNBA4YBPveyAO1H/w640-h258/BP120%20Premium%20Blood%20Pressure%20Support%206.jpg" alt="" width="640" height="258" border="0" data-original-height="825" data-original-width="2049" /></a></div>
<h2 style="text-align: left;"><strong>Safety and Side Effects Of <a href="https://bp120-premium-blood-pressure-support.webflow.io/">BP120 Premium Blood Pressure</a> Supplement</strong></h2>
<p>According to third-party clinical trials, the <a href="https://haitiliberte.com/advert/bp120-premium-blood-pressure-formula-maintaining-healthy-glucose-metabolism-energy-levelswork-or-hoax/">BP120 Premium</a> Blood Pressure pills are entirely safe for regular intake. At the same time, the formula is fully natural and contains no GMOs, allergens, chemicals, or other harmful substances. On account of this, you can rest assured that there is virtually no chance of it triggering any adverse results or side effects.</p>
<h3 style="text-align: left;"><span style="font-family: georgia;"><strong>IMPORTANT: <span style="background-color: #ffe599;"><span style="color: red;">Shocking Truth About BP120 Premium – This May Change Your Mind!</span></span></strong></span></h3>
<h2 style="text-align: left;"><strong><a href="https://sites.google.com/view/bp120-premium/">BP120 Premium</a> Advantages</strong></h2>
<p>• Unlike a clinical drug, this product can reduce high blood pressure and increase low blood pressure.</p>
<p>• The ingredients are all naturally sourced and won’t invoke any side effects.</p>
<p>• The manufacturing of each tablet is regulated under the cGMP guidelines; therefore, safety is guaranteed.</p>
<p>• Your blood pressure starts to fall into the ideal range within 30 days of regular use.</p>
<p>• You don’t need a doctor’s prescription to purchase or use this <a href="https://bp120-premium-report.clubeo.com/calendar/2023/08/29/bp120-premium-scientific-secret-premium-heart-health-formula-to-maintain-blood-pressure-and-flow-spam-or-legit">BP120 Premium</a> Blood Pressure Support.</p>
<h2 style="text-align: left;"><strong><a href="https://colab.research.google.com/drive/1H1YcK6azdsg_ZT20NX2U_2n0p_cE71MO?usp=sharing">BP120 Premium</a> Disadvantages</strong></h2>
<p>• In some people, the formulation may be ineffective or take longer than usual to evoke a positive response.</p>
<p>• Acidity is reported by some users, especially when the pills are consumed on an empty stomach.</p>
<h3 style="text-align: center;"><a href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure"><strong style="background-color: #ffe599; color: red; font-family: georgia;">TAKE ADVANTAGE OF THIS LIMITED OFFER TO STOCK UP ON BP120 Premium WHILE SUPPLIES LAST!</strong></a></h3>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEheMQpepLD_GtQSGJJfPjjwKRF5f-clGiBG_PVvCCHRCTWuh38h1o2Czcrwdo2FQq7pS0ZAgw0Vvjd4SrCvkYGgsIV4R-3GIuIB_CWZZWTHHNFyaFsUX0zNan0MJJ69KepfW3ZIE22nsqhr-P_s6zVb5Ep1QgAKrw8R68Lz4Wt4dd-OAwiDEMMV35ehVnD3/w640-h310/BP120%20Premium%20Blood%20Pressure%20Support%207.jpg" alt="" width="640" height="310" border="0" data-original-height="598" data-original-width="1231" /></a></div>
<h2 style="text-align: left;"><strong><a href="https://groups.google.com/g/bp120-premium/c/z7FV4QLgoAs">BP120 Premium Blood Pressure</a> Usage Guidelines</strong></h2>
<p>Taking <a href="https://erectin-report.clubeo.com/calendar/2023/08/29/bp120-premium-formula-100-effected-blood-pressure-support-neuro-vescular-support-pills-spam-or-legit">BP120 Premium</a> Blood Pressure is incredibly easy and fits seamlessly into your daily routine. Here’s a friendly guide on how to incorporate BP120 Premium Blood Pressure into your life:</p>
<p>• Each bottle of BP120 Premium Blood Pressure contains 60 capsules, which equates to 60 servings.</p>
<p>• Simply take one capsule of BP120 Premium Blood Pressure every day with your evening meal.</p>
<p>• Wash it down with a half glass of water, and you’re good to go!</p>
<p>The manufacturer assures that by following these guidelines, you’ll love the results and the positive impact it has on how you feel.</p>
<p>Make BP120 Premium Blood Pressure a part of your daily wellness routine and discover the friendly and effortless way to support your Blood Pressure and overall well-being.</p>
<h3 style="text-align: left;"><span style="font-family: georgia;"><strong>READ ALSO: <span style="background-color: #ffe599;"><span style="color: red;"><a href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure">Does the BP120 Premium Work For Everyone? Before you buy, read real customer reviews and testimonials!!</a></span></span></strong></span></h3>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjNMEXLEJaMg2iZHoeEVmOY6RK5vQMQxkxZv5Itx80ncfqYK3TF6YMo6an1GutQMqn6KN2wbx6ME-ZyY4YnlqPBmoLFAIgd7gHOASzWg4w3ci-xFBohNiL9soaWaw5dsRkVAOXf3T7LhrH2bGkE70LNnbOmVpQLEGUYRRUmQVek467ZFu6kuEdplfN8ne1i/w640-h318/BP120%20Premium%20Blood%20Pressure%20Support%202.jpg" alt="" width="640" height="318" border="0" data-original-height="733" data-original-width="1475" /></a></div>
<h2 style="text-align: left;"><strong>How Much Do <a href="https://lookerstudio.google.com/reporting/02b0e75a-3f0e-4bfd-9150-e38f4f888231">BP120 Premium Blood Pressure</a> Cost and Where to Buy?</strong></h2>
<p><a href="https://soundcloud.com/bp120-premium/bp120-premium-blood-pressure-formula-maintaining-healthy-glucose-metabolism-energy-levels">BP120 Premium</a> Blood Pressure is only available via the user-friendly and secure official website. Customers can make their orders quickly and without any hassles. All American orders attract free shipping and reduced prices on bulk orders. BP120 Premium Blood Pressure makers give customers a tracking identity number after placing an order. Similarly, it takes 3-7 working days for the shipping process to complete.</p>
<p>You can pay online using any major credit card (Visa, MasterCard, American Express, or Discover) or PayPal, among other payment options. All BP120 Premium Blood Pressure purchases come with a 90-day money-back guarantee. You can try the supplement for 90 days and request a refund if it does not work.</p>
<p><strong>These are the BP120 Premium costs which decline while getting more units simultaneously:</strong></p>
<p><strong>Basic -</strong> 1 Bottle Supply of BP120 Premium USD 59/bottle + SMALL SHIPPING.</p>
<p><strong>Popular Pack -</strong> Buy 3 Bottle Supply of BP120 Premium USD 49/bottle + SMALL SHIPPING.</p>
<p><strong>Best Value Pack - </strong>Buy 6 Bottle Supply of BP120 Premium USD 39/bottle + FREE SHIPPING.</p>
<p style="text-align: left;"><a href="https://bp120-premium.bandcamp.com/track/bp120-premium-blood-pressure-formula-maintaining-healthy-glucose-metabolism-energy-levels-work-or-hoax">BP120 Premium</a> Payments are made using 256-bit SSL technology to keep information safe and secure, and all orders arrive within a few business days of ordering.</p>
<h3 style="text-align: left;"><strong style="font-family: georgia;">Special Offer: <span style="background-color: #fff2cc; color: red;"><a href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure">Click Here To Get Heavy Discount Instantly!!</a></span></strong></h3>
<p><span style="font-family: times;"><span style="font-size: medium;"><span style="color: red;">Good News: Get additional discount on shipping when you checkout with Mastercard or Discover card!</span></span></span></p>
<div class="separator" style="clear: both; text-align: center;">
<p style="text-align: left;"><span style="font-size: medium;"><a style="clear: left; float: left; margin-bottom: 1em; margin-left: 1em;" href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure"><img src="https://blogger.googleusercontent.com/img/a/AVvXsEgJqDXBj2s2sKgxhjLGKnDNPxD392fUjUkF8lQbqbuoFZwPHnPE27muXA18Hs1EzbsUHHsPlOR9Njx119fwMPFiCrLv9NlRRfEUdLPeIVlqZmqjexv1dJ0pMoSO6VUtSY89rewM_LiPyGpkGpNCHHdprDSvrWyt6MprtcceNFal6bdDPK_FyvLHnQzy-A" alt="" width="110" height="120" border="0" data-original-height="120" data-original-width="110" /></a><span style="font-family: helvetica;"><span style="font-size: small;"><strong><span style="color: red;">APPROVED!</span><br /></strong></span></span></span></p>
<p style="text-align: left;"><span style="font-family: helvetica;"><span style="font-size: small;">Limited supply available. We currently have product in stock and ready to ship within <span style="color: red;">24 hours</span>.</span></span></p>
</div>
<p><span style="font-family: helvetica;"><span style="font-size: small;"><strong><span style="color: red;">EXPIRE SOON</span></strong></span></span></p>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj9Ypptpi1S8D2MEExW4NthbiA8bfs1nl3AMz3ImkmWjt2i81vME-zhaJKR32gkWX2lTJbEL0CvYxIWXZtzDJLhnxwlwo4NVP2QrS1xKw7vd18ynNjdXpIiM9n2bqvunauYm59zo-OxkKj-BM6f33tuYH0WlXaaQzJQ13vTuRhYuwPLKEwmD6rLpKSm4jS3/w480-h480/BP120%20Premium%20Blood%20Pressure%20Support%209.png" alt="" width="480" height="480" border="0" data-original-height="1400" data-original-width="1400" /></a></div>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhSJvORHtAeEI3H2rypjo7v70Cm2j2tC1B-Ja0K1qVp1MEYhmISktm3oeSPvmtOjcgIp6VWYex2WQ2w6gsXFZPdis4AmxfwRGftHtwSK5PNs5-vJjhVZwsNY6SljpUWbanRSWMbVUibr78lOgAkjowIEGQGH8g4my7mrAF8bND5KSQ7K8qU9d1qadr8WA/w327-h97/btn.png" alt="" width="327" height="97" border="0" data-original-height="84" data-original-width="282" /></a></div>
<p style="text-align: center;">By submitting, you affirm to have read and agreed to our <a href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure"><span style="color: red;">Terms & Conditions</span></a>.</p>
<p style="text-align: center;"><a href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure"><span style="font-size: medium;"><span style="background-color: #ffe599;"><span style="color: red;"><span style="font-family: georgia;"><strong>HUGE SAVINGS Get Your BP120 Premium “Get Something OFF” Get 2+1 Offer Hurry Only For 1st User!!</strong></span></span></span></span></a></p>
<h2 style="text-align: left;"><strong>Conclusion</strong></h2>
<p>Millions of people across the world put their best efforts into maintaining healthy Blood Pressure levels but fail. Healthy Blood Pressure is a key indicator of optimal health, and plenty of factors can lead to unbalanced Blood Pressure levels. If you have been diagnosed with fluctuating Blood Pressure levels lately, there’s no better remedy than BP120 Premium Blood Pressure.</p>
<p>Thousands of <a href="https://sway.office.com/QGQcddVIaxowl8gg?ref=Link&loc=mysways">BP120 Premium</a> Blood Pressure reviews have been uploaded to the official website to reassure skeptical buyers about the product’s utility. Those seeking a natural and harmless Blood Pressure management formula should visit BP120’s official site instead of trying gimmicky Blood Pressure support supplements one after another.</p>
<p class="ql-align-center" style="text-align: center;"><a href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgHYxn3NMPQqsmKG54OjkQESM8dw8D7zUXtssdLHaaWSYArzmNucZfEfKCOBsnUqZdp6i-enO0zDWtMGF2pKG2MifoTldIDExJOBDWxicPkSeox29VCmqX6Cz2feNaSfYBnC_BHUdfPT1qUGVgSNyn0NtyKxY-V-M-BDbo5jCOW4qSuxwu3TOTA3dSjIQ/s1600/Screenshot%20(1445).png" alt="" width="320" height="114" /></a></p>
<p class="ql-align-center" style="text-align: center;"><span style="font-family: georgia;"><strong><a href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure">Terms and Conditions</a></strong><strong> | </strong><strong><a href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure">Privacy</a></strong><strong> | </strong><a href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure"><strong>Contact Us</strong></a></span></p>
<p class="ql-align-center" style="text-align: center;"><span style="font-family: georgia;"><strong>© 2023 <a href="https://www.healthsupplement24x7.com/get-bp120-premium-blood-pressure">BP120 Premium</a></strong><strong>. All Rights Reserved.</strong></span></p> |
keiphone/CustomConcept101-256x256 | 2023-08-30T06:54:06.000Z | [
"task_categories:text-to-image",
"region:us"
] | keiphone | null | null | null | 1 | 0 | ---
task_categories:
- text-to-image
---
256x256 version of CustomConcept101 dataset, refer to https://www.cs.cmu.edu/~custom-diffusion/dataset.html |
Code-Hugger/airfoil-2dsteady | 2023-08-30T06:58:59.000Z | [
"license:apache-2.0",
"region:us"
] | Code-Hugger | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
Gauravvaid-shell/instruct-python-500k | 2023-08-30T06:54:41.000Z | [
"license:gpl-3.0",
"region:us"
] | Gauravvaid-shell | null | null | null | 0 | 0 | ---
license: gpl-3.0
dataset_info:
features:
- name: score_question
dtype: int16
- name: score_answer
dtype: int16
- name: question
dtype: string
- name: answer
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 9577425
num_examples: 6494
download_size: 5755017
dataset_size: 9577425
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Gauravvaid-shell/instruct-python-llama2-20k | 2023-08-30T09:28:53.000Z | [
"license:gpl-3.0",
"region:us"
] | Gauravvaid-shell | null | null | null | 0 | 0 | ---
license: gpl-3.0
---
|
tyzhu/fwv2_random_rare_tip_train_10_eval_10 | 2023-08-30T07:23:20.000Z | [
"region:us"
] | tyzhu | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 4445
num_examples: 30
- name: train_doc2id
num_bytes: 1821
num_examples: 20
- name: train_id2doc
num_bytes: 1881
num_examples: 20
- name: train_find_word
num_bytes: 2564
num_examples: 10
- name: eval_find_word
num_bytes: 1847
num_examples: 10
- name: id_context_mapping
num_bytes: 1241
num_examples: 20
download_size: 0
dataset_size: 13799
---
# Dataset Card for "fwv2_random_rare_tip_train_10_eval_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/fwv2_random_rare_tip_train_100_eval_100 | 2023-08-30T07:25:42.000Z | [
"region:us"
] | tyzhu | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 44575
num_examples: 300
- name: train_doc2id
num_bytes: 18225
num_examples: 200
- name: train_id2doc
num_bytes: 18825
num_examples: 200
- name: train_find_word
num_bytes: 25750
num_examples: 100
- name: eval_find_word
num_bytes: 18443
num_examples: 100
- name: id_context_mapping
num_bytes: 12425
num_examples: 200
download_size: 0
dataset_size: 138243
---
# Dataset Card for "fwv2_random_rare_tip_train_100_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/fwv2_random_rare_tip_train_1000_eval_100 | 2023-08-30T07:28:01.000Z | [
"region:us"
] | tyzhu | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 360515
num_examples: 2100
- name: train_doc2id
num_bytes: 100243
num_examples: 1100
- name: train_id2doc
num_bytes: 103543
num_examples: 1100
- name: train_find_word
num_bytes: 256972
num_examples: 1000
- name: eval_find_word
num_bytes: 18440
num_examples: 100
- name: id_context_mapping
num_bytes: 68343
num_examples: 1100
download_size: 0
dataset_size: 908056
---
# Dataset Card for "fwv2_random_rare_tip_train_1000_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/fwv2_squad_rare_tip_train_10_eval_10 | 2023-08-30T07:30:31.000Z | [
"region:us"
] | tyzhu | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 8076
num_examples: 30
- name: train_doc2id
num_bytes: 3546
num_examples: 20
- name: train_id2doc
num_bytes: 3606
num_examples: 20
- name: train_find_word
num_bytes: 4470
num_examples: 10
- name: eval_find_word
num_bytes: 2710
num_examples: 10
- name: id_context_mapping
num_bytes: 2966
num_examples: 20
download_size: 34497
dataset_size: 25374
---
# Dataset Card for "fwv2_squad_rare_tip_train_10_eval_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/fwv2_squad_rare_tip_train_100_eval_100 | 2023-08-30T07:33:14.000Z | [
"region:us"
] | tyzhu | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 81184
num_examples: 300
- name: train_doc2id
num_bytes: 36110
num_examples: 200
- name: train_id2doc
num_bytes: 36710
num_examples: 200
- name: train_find_word
num_bytes: 44474
num_examples: 100
- name: eval_find_word
num_bytes: 27815
num_examples: 100
- name: id_context_mapping
num_bytes: 30310
num_examples: 200
download_size: 165444
dataset_size: 256603
---
# Dataset Card for "fwv2_squad_rare_tip_train_100_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/fwv2_baseline_squad_train_10_eval_10 | 2023-08-30T09:43:59.000Z | [
"region:us"
] | tyzhu | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 3782
num_examples: 10
- name: eval_find_word
num_bytes: 3480
num_examples: 10
- name: validation
num_bytes: 3480
num_examples: 10
download_size: 18600
dataset_size: 10742
---
# Dataset Card for "fwv2_baseline_squad_train_10_eval_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/fwv2_baseline_squad_train_100_eval_100 | 2023-08-30T09:44:19.000Z | [
"region:us"
] | tyzhu | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 35816
num_examples: 100
- name: eval_find_word
num_bytes: 34800
num_examples: 100
- name: validation
num_bytes: 34800
num_examples: 100
download_size: 76678
dataset_size: 105416
---
# Dataset Card for "fwv2_baseline_squad_train_100_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/fwv2_baseline_squad_train_1000_eval_100 | 2023-08-30T09:44:44.000Z | [
"region:us"
] | tyzhu | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 362526
num_examples: 1000
- name: eval_find_word
num_bytes: 34692
num_examples: 100
- name: validation
num_bytes: 34692
num_examples: 100
download_size: 261498
dataset_size: 431910
---
# Dataset Card for "fwv2_baseline_squad_train_1000_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/fwv2_baseline_squad_train_10000_eval_100 | 2023-08-30T09:45:08.000Z | [
"region:us"
] | tyzhu | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 3612366
num_examples: 10000
- name: eval_find_word
num_bytes: 35542
num_examples: 100
- name: validation
num_bytes: 35542
num_examples: 100
download_size: 2150107
dataset_size: 3683450
---
# Dataset Card for "fwv2_baseline_squad_train_10000_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/fwv2_baseline_random_train_10_eval_10 | 2023-08-30T09:45:22.000Z | [
"region:us"
] | tyzhu | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1784
num_examples: 10
- name: eval_find_word
num_bytes: 1714
num_examples: 10
- name: validation
num_bytes: 1714
num_examples: 10
download_size: 3613
dataset_size: 5212
---
# Dataset Card for "fwv2_baseline_random_train_10_eval_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/fwv2_squad_rare_tip_train_1000_eval_100 | 2023-08-30T07:35:57.000Z | [
"region:us"
] | tyzhu | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 639518
num_examples: 2100
- name: train_doc2id
num_bytes: 196084
num_examples: 1100
- name: train_id2doc
num_bytes: 199384
num_examples: 1100
- name: train_find_word
num_bytes: 440134
num_examples: 1000
- name: eval_find_word
num_bytes: 27405
num_examples: 100
- name: id_context_mapping
num_bytes: 164184
num_examples: 1100
download_size: 996873
dataset_size: 1666709
---
# Dataset Card for "fwv2_squad_rare_tip_train_1000_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/fwv2_baseline_random_train_100_eval_100 | 2023-08-30T09:45:33.000Z | [
"region:us"
] | tyzhu | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 17246
num_examples: 100
- name: eval_find_word
num_bytes: 17146
num_examples: 100
- name: validation
num_bytes: 17146
num_examples: 100
download_size: 34351
dataset_size: 51538
---
# Dataset Card for "fwv2_baseline_random_train_100_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/fwv2_baseline_random_train_1000_eval_100 | 2023-08-30T09:46:50.000Z | [
"region:us"
] | tyzhu | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 178392
num_examples: 1000
- name: eval_find_word
num_bytes: 17146
num_examples: 100
- name: validation
num_bytes: 17146
num_examples: 100
download_size: 84243
dataset_size: 212684
---
# Dataset Card for "fwv2_baseline_random_train_1000_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/fwv2_baseline_random_train_10000_eval_100 | 2023-08-30T09:47:09.000Z | [
"region:us"
] | tyzhu | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1784070
num_examples: 10000
- name: eval_find_word
num_bytes: 17146
num_examples: 100
- name: validation
num_bytes: 17146
num_examples: 100
download_size: 834698
dataset_size: 1818362
---
# Dataset Card for "fwv2_baseline_random_train_10000_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BensonZhang/Laion-1M-parquet | 2023-08-30T07:35:53.000Z | [
"region:us"
] | BensonZhang | null | null | null | 0 | 0 | Entry not found |
gagan3012/WizardLM_evol_instruct_70k_clustered | 2023-08-30T07:32:01.000Z | [
"region:us"
] | gagan3012 | null | null | null | 0 | 0 | Entry not found |
Monocrat/Cancanneed | 2023-08-30T07:52:33.000Z | [
"license:other",
"region:us"
] | Monocrat | null | null | null | 0 | 0 | ---
license: other
---
|
danjacobellis/imagenet_RDAE_batched_250k | 2023-08-30T22:55:35.000Z | [
"region:us"
] | danjacobellis | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: compressed_batch
sequence: binary
- name: label
sequence: int64
- name: latent_size
sequence: int64
splits:
- name: train
num_bytes: 649497545
num_examples: 2062
- name: test
num_bytes: 36909237
num_examples: 117
download_size: 639261596
dataset_size: 686406782
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "imagenet_RDAE_batched_250k"
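This dataset stores compressed batches (binary sequences) rather than individual examples, so `num_examples` counts batches. The figures below are copied from the card's YAML header; this local sketch just cross-checks that the declared `dataset_size` equals the sum of the split byte counts, without downloading anything:

```python
# Split metadata copied from the card's YAML header: name -> (num_bytes, num_batches).
splits = {
    "train": (649497545, 2062),
    "test": (36909237, 117),
}

# The declared dataset_size should be the sum of the per-split byte counts.
dataset_size = sum(num_bytes for num_bytes, _ in splits.values())
assert dataset_size == 686406782

# Loading the batched records themselves (network access required) would look like:
#   from datasets import load_dataset
#   ds = load_dataset("danjacobellis/imagenet_RDAE_batched_250k", split="test")
```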
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
qq371/PerfectWorld | 2023-08-30T09:11:42.000Z | [
"region:us"
] | qq371 | null | null | null | 0 | 0 | Entry not found |
OzoneAsai/picts | 2023-08-30T07:49:38.000Z | [
"region:us"
] | OzoneAsai | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_quantumaikr__QuantumLM-llama2-70B-Korean-LoRA | 2023-08-30T07:54:46.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of quantumaikr/QuantumLM-llama2-70B-Korean-LoRA
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [quantumaikr/QuantumLM-llama2-70B-Korean-LoRA](https://huggingface.co/quantumaikr/QuantumLM-llama2-70B-Korean-LoRA)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_quantumaikr__QuantumLM-llama2-70B-Korean-LoRA\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-30T07:53:24.183560](https://huggingface.co/datasets/open-llm-leaderboard/details_quantumaikr__QuantumLM-llama2-70B-Korean-LoRA/blob/main/results_2023-08-30T07%3A53%3A24.183560.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6934168799483462,\n\
\ \"acc_stderr\": 0.03115919348812645,\n \"acc_norm\": 0.6971494359890498,\n\
\ \"acc_norm_stderr\": 0.031131669600877022,\n \"mc1\": 0.401468788249694,\n\
\ \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5608488880093394,\n\
\ \"mc2_stderr\": 0.014874770245335572\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6749146757679181,\n \"acc_stderr\": 0.013688147309729119,\n\
\ \"acc_norm\": 0.7056313993174061,\n \"acc_norm_stderr\": 0.013318528460539422\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6743676558454491,\n\
\ \"acc_stderr\": 0.004676529200753001,\n \"acc_norm\": 0.8638717386974706,\n\
\ \"acc_norm_stderr\": 0.0034222387022263645\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8289473684210527,\n \"acc_stderr\": 0.03064360707167709,\n\
\ \"acc_norm\": 0.8289473684210527,\n \"acc_norm_stderr\": 0.03064360707167709\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724057,\n\
\ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724057\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.031164899666948617,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.031164899666948617\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6808510638297872,\n \"acc_stderr\": 0.030472973363380045,\n\
\ \"acc_norm\": 0.6808510638297872,\n \"acc_norm_stderr\": 0.030472973363380045\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43915343915343913,\n \"acc_stderr\": 0.02555992055053101,\n \"\
acc_norm\": 0.43915343915343913,\n \"acc_norm_stderr\": 0.02555992055053101\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8451612903225807,\n\
\ \"acc_stderr\": 0.020579287326583227,\n \"acc_norm\": 0.8451612903225807,\n\
\ \"acc_norm_stderr\": 0.020579287326583227\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.03510766597959217,\n\
\ \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.03510766597959217\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8242424242424242,\n \"acc_stderr\": 0.02972094300622445,\n\
\ \"acc_norm\": 0.8242424242424242,\n \"acc_norm_stderr\": 0.02972094300622445\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"\
acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.01673108529360756,\n\
\ \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.01673108529360756\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7256410256410256,\n \"acc_stderr\": 0.022622765767493225,\n\
\ \"acc_norm\": 0.7256410256410256,\n \"acc_norm_stderr\": 0.022622765767493225\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083015,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083015\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7857142857142857,\n \"acc_stderr\": 0.026653531596715484,\n\
\ \"acc_norm\": 0.7857142857142857,\n \"acc_norm_stderr\": 0.026653531596715484\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.45695364238410596,\n \"acc_stderr\": 0.04067325174247443,\n \"\
acc_norm\": 0.45695364238410596,\n \"acc_norm_stderr\": 0.04067325174247443\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8844036697247707,\n \"acc_stderr\": 0.013708749534172636,\n \"\
acc_norm\": 0.8844036697247707,\n \"acc_norm_stderr\": 0.013708749534172636\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5879629629629629,\n \"acc_stderr\": 0.03356787758160831,\n \"\
acc_norm\": 0.5879629629629629,\n \"acc_norm_stderr\": 0.03356787758160831\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9068627450980392,\n \"acc_stderr\": 0.020397853969427,\n \"acc_norm\"\
: 0.9068627450980392,\n \"acc_norm_stderr\": 0.020397853969427\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.8734177215189873,\n \"acc_stderr\": 0.02164419572795517,\n \"\
acc_norm\": 0.8734177215189873,\n \"acc_norm_stderr\": 0.02164419572795517\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7757847533632287,\n\
\ \"acc_stderr\": 0.027991534258519517,\n \"acc_norm\": 0.7757847533632287,\n\
\ \"acc_norm_stderr\": 0.027991534258519517\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8320610687022901,\n \"acc_stderr\": 0.032785485373431386,\n\
\ \"acc_norm\": 0.8320610687022901,\n \"acc_norm_stderr\": 0.032785485373431386\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8842975206611571,\n \"acc_stderr\": 0.029199802455622814,\n \"\
acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.029199802455622814\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n\
\ \"acc_stderr\": 0.03434300243631,\n \"acc_norm\": 0.8518518518518519,\n\
\ \"acc_norm_stderr\": 0.03434300243631\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971726,\n\
\ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971726\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.0398913985953177,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.0398913985953177\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8531289910600255,\n\
\ \"acc_stderr\": 0.012658201736147278,\n \"acc_norm\": 0.8531289910600255,\n\
\ \"acc_norm_stderr\": 0.012658201736147278\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7687861271676301,\n \"acc_stderr\": 0.022698657167855713,\n\
\ \"acc_norm\": 0.7687861271676301,\n \"acc_norm_stderr\": 0.022698657167855713\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5094972067039106,\n\
\ \"acc_stderr\": 0.01671948464334877,\n \"acc_norm\": 0.5094972067039106,\n\
\ \"acc_norm_stderr\": 0.01671948464334877\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.024170840879340873,\n\
\ \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.024170840879340873\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7588424437299035,\n\
\ \"acc_stderr\": 0.024296594034763426,\n \"acc_norm\": 0.7588424437299035,\n\
\ \"acc_norm_stderr\": 0.024296594034763426\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.022021366100220194,\n\
\ \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.022021366100220194\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5567375886524822,\n \"acc_stderr\": 0.02963483847376601,\n \
\ \"acc_norm\": 0.5567375886524822,\n \"acc_norm_stderr\": 0.02963483847376601\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5560625814863103,\n\
\ \"acc_stderr\": 0.012689708167787677,\n \"acc_norm\": 0.5560625814863103,\n\
\ \"acc_norm_stderr\": 0.012689708167787677\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7536764705882353,\n \"acc_stderr\": 0.02617343857052,\n\
\ \"acc_norm\": 0.7536764705882353,\n \"acc_norm_stderr\": 0.02617343857052\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7630718954248366,\n \"acc_stderr\": 0.01720166216978977,\n \
\ \"acc_norm\": 0.7630718954248366,\n \"acc_norm_stderr\": 0.01720166216978977\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7755102040816326,\n \"acc_stderr\": 0.0267114305555384,\n\
\ \"acc_norm\": 0.7755102040816326,\n \"acc_norm_stderr\": 0.0267114305555384\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n\
\ \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n\
\ \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.93,\n \"acc_stderr\": 0.0256432399976243,\n \
\ \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.0256432399976243\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.02709729011807082,\n\
\ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.02709729011807082\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.401468788249694,\n\
\ \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5608488880093394,\n\
\ \"mc2_stderr\": 0.014874770245335572\n }\n}\n```"
repo_url: https://huggingface.co/quantumaikr/QuantumLM-llama2-70B-Korean-LoRA
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|arc:challenge|25_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hellaswag|10_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T07:53:24.183560.parquet'
- config_name: results
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- results_2023-08-30T07:53:24.183560.parquet
- split: latest
path:
- results_2023-08-30T07:53:24.183560.parquet
---
# Dataset Card for Evaluation run of quantumaikr/QuantumLM-llama2-70B-Korean-LoRA
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/quantumaikr/QuantumLM-llama2-70B-Korean-LoRA
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [quantumaikr/QuantumLM-llama2-70B-Korean-LoRA](https://huggingface.co/quantumaikr/QuantumLM-llama2-70B-Korean-LoRA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_quantumaikr__QuantumLM-llama2-70B-Korean-LoRA",
"harness_truthfulqa_mc_0",
	split="latest")
```
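Each per-task config name encodes the harness prefix, the task, and the few-shot count (e.g. `harness_hendrycksTest_abstract_algebra_5` is the 5-shot abstract algebra task). If needed, these pieces can be recovered with simple string handling; the helper below is an illustrative sketch, not part of the official tooling:

```python
def parse_config_name(config_name: str) -> tuple[str, int]:
    """Split a config name like 'harness_hendrycksTest_abstract_algebra_5'
    into a task stem and the trailing few-shot count."""
    prefix = "harness_"
    if not config_name.startswith(prefix):
        raise ValueError(f"unexpected config name: {config_name}")
    body = config_name[len(prefix):]       # e.g. 'hendrycksTest_abstract_algebra_5'
    stem, _, shots = body.rpartition("_")  # split off the trailing shot count
    return stem, int(shots)

task, shots = parse_config_name("harness_hendrycksTest_abstract_algebra_5")
# task == 'hendrycksTest_abstract_algebra', shots == 5
```

Note that the config names use underscores throughout, while the parquet file names and result keys use a dash after the harness family (e.g. `hendrycksTest-abstract_algebra`) and a colon for `truthfulqa:mc`, so mapping a stem back to the exact task key may need a small lookup table.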
## Latest results
These are the [latest results from run 2023-08-30T07:53:24.183560](https://huggingface.co/datasets/open-llm-leaderboard/details_quantumaikr__QuantumLM-llama2-70B-Korean-LoRA/blob/main/results_2023-08-30T07%3A53%3A24.183560.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6934168799483462,
"acc_stderr": 0.03115919348812645,
"acc_norm": 0.6971494359890498,
"acc_norm_stderr": 0.031131669600877022,
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693654,
"mc2": 0.5608488880093394,
"mc2_stderr": 0.014874770245335572
},
"harness|arc:challenge|25": {
"acc": 0.6749146757679181,
"acc_stderr": 0.013688147309729119,
"acc_norm": 0.7056313993174061,
"acc_norm_stderr": 0.013318528460539422
},
"harness|hellaswag|10": {
"acc": 0.6743676558454491,
"acc_stderr": 0.004676529200753001,
"acc_norm": 0.8638717386974706,
"acc_norm_stderr": 0.0034222387022263645
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8289473684210527,
"acc_stderr": 0.03064360707167709,
"acc_norm": 0.8289473684210527,
"acc_norm_stderr": 0.03064360707167709
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724057,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.031164899666948617,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.031164899666948617
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6808510638297872,
"acc_stderr": 0.030472973363380045,
"acc_norm": 0.6808510638297872,
"acc_norm_stderr": 0.030472973363380045
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43915343915343913,
"acc_stderr": 0.02555992055053101,
"acc_norm": 0.43915343915343913,
"acc_norm_stderr": 0.02555992055053101
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8451612903225807,
"acc_stderr": 0.020579287326583227,
"acc_norm": 0.8451612903225807,
"acc_norm_stderr": 0.020579287326583227
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.03510766597959217,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.03510766597959217
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8242424242424242,
"acc_stderr": 0.02972094300622445,
"acc_norm": 0.8242424242424242,
"acc_norm_stderr": 0.02972094300622445
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.01673108529360756,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.01673108529360756
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7256410256410256,
"acc_stderr": 0.022622765767493225,
"acc_norm": 0.7256410256410256,
"acc_norm_stderr": 0.022622765767493225
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083015,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083015
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7857142857142857,
"acc_stderr": 0.026653531596715484,
"acc_norm": 0.7857142857142857,
"acc_norm_stderr": 0.026653531596715484
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.45695364238410596,
"acc_stderr": 0.04067325174247443,
"acc_norm": 0.45695364238410596,
"acc_norm_stderr": 0.04067325174247443
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8844036697247707,
"acc_stderr": 0.013708749534172636,
"acc_norm": 0.8844036697247707,
"acc_norm_stderr": 0.013708749534172636
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5879629629629629,
"acc_stderr": 0.03356787758160831,
"acc_norm": 0.5879629629629629,
"acc_norm_stderr": 0.03356787758160831
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9068627450980392,
"acc_stderr": 0.020397853969427,
"acc_norm": 0.9068627450980392,
"acc_norm_stderr": 0.020397853969427
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8734177215189873,
"acc_stderr": 0.02164419572795517,
"acc_norm": 0.8734177215189873,
"acc_norm_stderr": 0.02164419572795517
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7757847533632287,
"acc_stderr": 0.027991534258519517,
"acc_norm": 0.7757847533632287,
"acc_norm_stderr": 0.027991534258519517
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8320610687022901,
"acc_stderr": 0.032785485373431386,
"acc_norm": 0.8320610687022901,
"acc_norm_stderr": 0.032785485373431386
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.029199802455622814,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.029199802455622814
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.03434300243631,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.03434300243631
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.030446777687971726,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.030446777687971726
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.0398913985953177,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.0398913985953177
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8531289910600255,
"acc_stderr": 0.012658201736147278,
"acc_norm": 0.8531289910600255,
"acc_norm_stderr": 0.012658201736147278
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7687861271676301,
"acc_stderr": 0.022698657167855713,
"acc_norm": 0.7687861271676301,
"acc_norm_stderr": 0.022698657167855713
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5094972067039106,
"acc_stderr": 0.01671948464334877,
"acc_norm": 0.5094972067039106,
"acc_norm_stderr": 0.01671948464334877
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.024170840879340873,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.024170840879340873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7588424437299035,
"acc_stderr": 0.024296594034763426,
"acc_norm": 0.7588424437299035,
"acc_norm_stderr": 0.024296594034763426
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.022021366100220194,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.022021366100220194
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5567375886524822,
"acc_stderr": 0.02963483847376601,
"acc_norm": 0.5567375886524822,
"acc_norm_stderr": 0.02963483847376601
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5560625814863103,
"acc_stderr": 0.012689708167787677,
"acc_norm": 0.5560625814863103,
"acc_norm_stderr": 0.012689708167787677
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7536764705882353,
"acc_stderr": 0.02617343857052,
"acc_norm": 0.7536764705882353,
"acc_norm_stderr": 0.02617343857052
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7630718954248366,
"acc_stderr": 0.01720166216978977,
"acc_norm": 0.7630718954248366,
"acc_norm_stderr": 0.01720166216978977
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.04309118709946458,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.04309118709946458
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7755102040816326,
"acc_stderr": 0.0267114305555384,
"acc_norm": 0.7755102040816326,
"acc_norm_stderr": 0.0267114305555384
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700643,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.0256432399976243,
"acc_norm": 0.93,
"acc_norm_stderr": 0.0256432399976243
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.02709729011807082,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.02709729011807082
},
"harness|truthfulqa:mc|0": {
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693654,
"mc2": 0.5608488880093394,
"mc2_stderr": 0.014874770245335572
}
}
```
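The `"all"` block above aggregates the per-task metrics (e.g. `acc` averaged over the individual task entries). A minimal sketch of recomputing such an aggregate from a results dict of this shape, shown here with a small hypothetical subset of the tasks above:

```python
def mean_metric(results: dict, metric: str) -> float:
    """Average a metric over all per-task entries that report it,
    skipping the precomputed 'all' summary."""
    values = [
        task_scores[metric]
        for task_name, task_scores in results.items()
        if task_name != "all" and metric in task_scores
    ]
    return sum(values) / len(values)

# A tiny subset of the results above, for illustration only:
results = {
    "all": {"acc": 0.42},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.33, "acc_norm": 0.33},
    "harness|hendrycksTest-college_chemistry|5": {"acc": 0.51, "acc_norm": 0.51},
    "harness|truthfulqa:mc|0": {"mc1": 0.4015, "mc2": 0.5608},
}
print(round(mean_metric(results, "acc"), 3))  # → 0.42
```

This mirrors how the leaderboard summary is computed in spirit, but the exact official aggregation (weighting, which tasks feed which headline number) is defined by the leaderboard code, not by this sketch.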
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
AIFreshmen/c4 | 2023-08-30T07:53:57.000Z | [
"region:us"
] | AIFreshmen | null | null | null | 0 | 0 | Entry not found |
sidharthsingh1892/test_data_1 | 2023-08-30T08:02:23.000Z | [
"region:us"
] | sidharthsingh1892 | null | null | null | 0 | 0 | Entry not found |
sidharthsingh1892/test_data_3 | 2023-08-30T08:07:14.000Z | [
"region:us"
] | sidharthsingh1892 | null | null | null | 0 | 0 | Entry not found |
hecool108/ctee-p1 | 2023-08-30T08:20:19.000Z | [
"region:us"
] | hecool108 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 222513615.0
num_examples: 744
download_size: 221812729
dataset_size: 222513615.0
---
# Dataset Card for "ctee-p1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
prashantkambi/datasetdescription | 2023-09-11T11:13:02.000Z | [
"region:us"
] | prashantkambi | null | null | null | 0 | 0 | Entry not found |
sidharthsingh1892/test_data_4 | 2023-08-30T08:24:43.000Z | [
"region:us"
] | sidharthsingh1892 | null | null | null | 0 | 0 | Entry not found |
8bitkick/bbc-micro-manual-test | 2023-08-30T22:22:53.000Z | [
"region:us"
] | 8bitkick | null | null | null | 0 | 0 | Entry not found |
AIFreshmen/c444 | 2023-08-30T08:32:13.000Z | [
"region:us"
] | AIFreshmen | null | null | null | 0 | 0 | Entry not found |
pavithrav/testing2 | 2023-08-30T08:38:33.000Z | [
"region:us"
] | pavithrav | null | null | null | 0 | 0 | Entry not found |
dongyoung4091/shp_with_features_20k_rx_tmp | 2023-08-30T08:39:30.000Z | [
"region:us"
] | dongyoung4091 | null | null | null | 0 | 0 | Entry not found |
ocvawes/ocvawes | 2023-08-30T08:58:09.000Z | [
"region:us"
] | ocvawes | null | null | null | 0 | 0 | Entry not found |
AnimaleMEGummiesAustralia/AnimaleMaleEnhancementAustralia | 2023-08-30T08:54:04.000Z | [
"region:us"
] | AnimaleMEGummiesAustralia | null | null | null | 0 | 0 | <h2><span style="background-color: #ffff00;"><strong>Our Official Facebook Pages ⇒</strong></span></h2>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInAU/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementPills/"><strong>https://www.facebook.com/AnimaleMaleEnhancementPills/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAustralia/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInAustralia/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesAustralia/"><strong>https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesAustralia/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Australia/100093487323174/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-Australia/100093487323174/</strong></a></p>
<p> </p>
<h3><span style="font-weight: 400;">➥ Product Name — </span><span style="color: #800000;"><a style="color: #800000;" href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>Animale Male Enhancement Australia</strong></a><span style="font-weight: 400;"> </span></span></h3>
<h3><span style="font-weight: 400;">➥ Country — </span><span style="color: #800000;"><a style="color: #800000;" href="https://www.facebook.com/AnimaleMaleEnhancementPills/"><strong>Australia</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥ Main Benefits — </span><span style="color: #800000;"><a style="color: #800000;" href="https://www.facebook.com/AnimaleMaleEnhancementInAustralia/"><strong>Male Enhancement</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥ Rating —</span> <span style="color: #800000;"><a style="color: #800000;" href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>5.0/5.0</strong></a></span><span style="font-weight: 400;"> ⭐⭐⭐⭐⭐</span></h3>
<h3><span style="font-weight: 400;">➥ Results — </span><span style="color: #800000;"><a style="color: #800000;" href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>In 1-3 Months</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥ Availability — </span><span style="color: #800000;"><a style="color: #800000;" href="https://healthcare24hrs.com/animal-male-enhancement-Australia"><strong>Online</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥ Side Effects — </span><span style="color: #800000;"><a style="color: #800000;" href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>No Major Side Effects</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥ Official Website (Sale Is Live) — </span><span style="color: #993366;"><a style="color: #993366;" href="https://healthcare24hrs.com/animal-male-enhancement-Australia"><strong>Click Here To Order Animale Male Enhancement Australia</strong></a></span></h3>
<p> </p>
<p><span style="font-weight: 400;">From cutting-edge supplements to innovative training methods, we will cover everything you need to know to take your performance to the next level. So, if you're ready to up your game, read on to discover the best performance enhancers to try in 2023.</span></p>
<p> </p>
<h2><span style="color: #ff6600; background-color: #000000;"><a style="color: #ff6600; background-color: #000000;" href="https://healthcare24hrs.com/animal-male-enhancement-Australia"><strong>➢➢ Visit The Official Website To Get Your Male Enhancement Now ➢➢</strong></a></span></h2>
<p> </p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>Animale Male Enhancement Australia</strong></a><span style="font-weight: 400;"> Supplement Reviews: In today's world, people are constantly seeking ways to improve their performance and achieve their goals. Whether it's in sports, academics, or professional life, performance enhancement is a hot topic. With the advent of new technology and research, there are a plethora of options available for those looking to boost their performance. In this article, we will explore the best performance enhancers, Animale Nitric Oxide, to try in 2023. From cutting-edge supplements to innovative training methods, we will cover everything you need to know to take your performance to the next level. So, if you're ready to up your game, read on to discover the best performance enhancers to try in 2023. Let’s have a look! </span></p>
<p> </p>
<h2><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementPills/"><strong>About Animale Male Enhancement Australia: </strong></a></span></h2>
<p> </p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAustralia/"><strong>Animale Male Enhancement Australia</strong></a><span style="font-weight: 400;"> is a performance enhancer that has been gaining popularity in the fitness industry. It is designed to increase the levels of nitric oxide in the body, which is a naturally occurring compound that plays a crucial role in promoting cardiovascular health and enhancing physical performance. Nitric oxide helps to dilate blood vessels, allowing more blood to flow to the muscles, which in turn leads to increased endurance, strength, and power. </span></p>
<p> </p>
<p><a href="https://healthcare24hrs.com/animal-male-enhancement-Australia"><img src="https://i.ibb.co/MD244jJ/dfggvgfdgfg.png" alt="dfggvgfdgfg" border="0" /></a></p>
<p> </p>
<p> </p>
<h1><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthcare24hrs.com/animal-male-enhancement-Australia"><strong>➢ (LOWEST PRICE GUARANTEED) Click Here to Avail Special Discount Deal on Animale Male Enhancement Australia Now!</strong></a></span></h1>
<p> </p>
<p><span style="font-weight: 400;">The </span><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>Animale Male Enhancement Australia</strong></a><span style="font-weight: 400;"> contains a blend of ingredients that work synergistically to increase nitric oxide levels in the body. The supplement also contains other ingredients that are known to have performance-enhancing properties. These, in turn, help to increase endurance and reduce fatigue by buffering the build-up of lactic acid in the muscles. </span></p>
<p> </p>
<p><span style="font-weight: 400;">The </span><a href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>Animale Male Enhancement Australia</strong></a><span style="font-weight: 400;"> is designed to be taken before workouts, and it is recommended to take two capsules per day. </span></p>
<p> </p>
<p><span style="font-weight: 400;">One of the biggest advantages of the </span><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>Animale Male Enhancement Australia</strong></a><span style="font-weight: 400;"> is that it is a natural supplement, meaning that it does not contain any artificial ingredients or harmful chemicals. It is also free from caffeine and other stimulants, making it a great option for those who are sensitive to these substances. </span></p>
<p> </p>
<p><span style="font-weight: 400;">In addition to its performance-enhancing properties, the </span><a href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesAustralia/"><strong>Animale Muscle Building Formula</strong></a><span style="font-weight: 400;"> is also believed to have a number of health benefits. Studies have shown that increasing nitric oxide levels in the body can help to lower blood pressure, improve cardiovascular health, and boost cognitive function. The Animale Muscle Building supplement is available for sale in the UK, </span><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Australia/100093487323174/"><strong>Australia</strong></a><span style="font-weight: 400;">, New Zealand, Mexico, Israel, Philippines, Jamaica, Barbados, Malaysia, Belize, Japan, Türkiye, etc. </span></p>
<p> </p>
<h1><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthcare24hrs.com/animal-male-enhancement-Australia"><strong>➢ MUST SEE: (SPECIAL SAVINGS) Click Here to Get Animale Male Enhancement Australia For an Exclusive Discounted Price!!</strong></a></span></h1>
<p> </p>
<h2><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>Benefits of taking Animale Male Enhancement Australia Supplement: </strong></a></span></h2>
<p> </p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementPills/"><strong>Animale Male Enhancement Australia</strong></a><span style="font-weight: 400;"> is a dietary supplement designed to enhance athletic performance by boosting nitric oxide production in the body. Here are six benefits of taking Animale Male Enhancement Australia: </span></p>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementInAustralia/"><strong>Increased Endurance and Stamina: </strong></a></span></h3>
<p> </p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>Animale Male Enhancement Australia NZ</strong></a><span style="font-weight: 400;"> can help increase your endurance and stamina by improving blood flow to your muscles. This helps deliver more oxygen and nutrients to your muscles, allowing you to work out harder and for longer periods of time. </span></p>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>Faster Recovery: </strong></a></span></h3>
<p> </p>
<p><span style="font-weight: 400;">The increased blood flow and nutrient delivery provided by </span><a href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesAustralia/"><strong>Animale Male Enhancement Australia</strong></a><span style="font-weight: 400;"> can also help speed up your recovery time after intense exercise. This means you can get back to your workouts sooner and make more progress in less time. </span></p>
<p> </p>
<h3><span style="background-color: #ffff00; color: #000000;"><a style="background-color: #ffff00; color: #000000;" href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>Improved Muscle Growth: </strong></a></span></h3>
<h3> </h3>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Australia/100093487323174/"><strong>Animale Male Enhancement Australia</strong></a><span style="font-weight: 400;"> supplement can help stimulate muscle growth by improving nutrient uptake and oxygen delivery to your muscles. This means you can build muscle more efficiently and see results faster. </span></p>
<p> </p>
<p><a href="https://healthcare24hrs.com/animal-male-enhancement-Australia"><img src="https://i.ibb.co/qWVMqDC/Animale-CBD-Gummies-Au.png" alt="Animale-CBD-Gummies-Au" border="0" /></a><br /><br /></p>
<p> </p>
<h1><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthcare24hrs.com/animal-male-enhancement-Australia"><strong>➢ (HUGE SAVINGS TODAY) Click Here to Buy Animale Male Enhancement Australia For The Current Most Discounted Price Today!!</strong></a></span></h1>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>Increased Strength: </strong></a></span></h3>
<p> </p>
<p><span style="font-weight: 400;">By improving blood flow and nutrient delivery to your muscles, </span><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>Animale Muscle Building supplement</strong> </a><span style="font-weight: 400;">can help increase your strength and power. This can translate to better performance in sports and other physical activities. </span></p>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementPills/"><strong>Enhanced Mental Focus: </strong></a></span></h3>
<p> </p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesAustralia/"><strong>“Animale Male Enhancement Australia”</strong></a> <span style="font-weight: 400;">can also help improve your mental focus and clarity. This can help you stay motivated and focused during your workouts, which can lead to better results. </span></p>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementInAustralia/"><strong>Cardiovascular Health: </strong></a></span></h3>
<p> </p>
<p><span style="font-weight: 400;">Nitric oxide has been shown to have a positive impact on cardiovascular health. By improving blood flow and oxygen delivery to the heart and other organs, </span><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Australia/100093487323174/"><strong>Animale Male Enhancement Australia</strong></a><span style="font-weight: 400;"> may help reduce the risk of heart disease and other cardiovascular conditions. </span></p>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>Conditions that can occur due to low testosterone: </strong></a></span></h3>
<p> </p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>Low testosterone levels,</strong></a><span style="font-weight: 400;"> also known as hypogonadism, can lead to a variety of physical and psychological symptoms in men. Testosterone is a hormone that is responsible for the development of male physical characteristics, such as the growth of facial and body hair, muscle mass, and a deeper voice. It also plays a role in regulating mood, energy levels, and cognitive function. Here are some of the conditions that can occur due to low testosterone: </span></p>
<p> </p>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>Fatigue and low energy levels: </strong></a></span></h3>
<p> </p>
<p><span style="font-weight: 400;">Testosterone plays a role in regulating energy levels and fatigue. Men with low testosterone levels may experience a lack of energy, increased fatigue, and decreased motivation. </span></p>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementPills/"><strong>Decreased muscle mass and strength: </strong></a></span></h3>
<p> </p>
<p><span style="font-weight: 400;">Testosterone is important for building and maintaining muscle mass and strength. Men with low testosterone levels may experience a decrease in muscle mass and strength. </span></p>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementInAustralia/"><strong>Increased body fat: </strong></a></span></h3>
<p> </p>
<p><span style="font-weight: 400;">Testosterone helps regulate fat distribution in the body. Men with low testosterone levels may experience an increase in body fat, particularly in the abdominal area. </span></p>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>Mood changes: </strong></a></span></h3>
<p> </p>
<p><span style="font-weight: 400;">Testosterone plays a role in regulating mood and emotional well-being. Men with low testosterone levels may experience mood swings, irritability, and depression. </span></p>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>Decreased bone density: </strong></a></span></h3>
<p> </p>
<p><span style="font-weight: 400;">Testosterone helps maintain bone density, so men with low testosterone levels may experience a decrease in bone density and an increased risk of osteoporosis. </span></p>
<p> </p>
<p><a href="https://healthcare24hrs.com/animal-male-enhancement-Australia"><img src="https://i.ibb.co/qdXdYdY/Animale-CBD-Gummies.png" alt="Animale-CBD-Gummies" border="0" /></a></p>
<p> </p>
<p> </p>
<p> </p>
<h2><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>Animale Male Enhancement Australia Prices </strong></a></span></h2>
<p> </p>
<p><span style="background-color: #00ffff;"><strong>The cost of </strong><a style="background-color: #00ffff;" href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesAustralia/"><strong>Animale Male Enhancement Australia</strong></a><strong> is low compared to other male enhancement supplements. Check the price list below: </strong></span></p>
<p> </p>
<p><span style="color: #993300; background-color: #ccffff;"><a style="color: #993300; background-color: #ccffff;" href="https://www.facebook.com/people/Animale-Male-Enhancement-Australia/100093487323174/"><strong>Buy 3 Get 2 Free* - $39.95 per bottle </strong></a></span></p>
<p> </p>
<p><span style="color: #993300; background-color: #ccffff;"><a style="color: #993300; background-color: #ccffff;" href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>Buy 2 Get 1 Free* - $49.95 per bottle </strong></a></span></p>
<p> </p>
<p><span style="color: #993300; background-color: #ccffff;"><a style="color: #993300; background-color: #ccffff;" href="https://www.facebook.com/AnimaleMaleEnhancementPills/"><strong>Buy 1 Bottle - $69.95 per bottle </strong></a></span></p>
<p> </p>
<p><span style="color: #ff6600;"><a style="color: #ff6600;" href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>* at retail price </strong></a></span></p>
<p> </p>
<p><span style="font-weight: 400;">This </span><a href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>Animale Male Enhancement Australia</strong></a><span style="font-weight: 400;"> is available for sale in Belize, </span><a href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>Australia</strong></a><span style="font-weight: 400;">, New Zealand, Japan, Türkiye, Mexico, Israel, Philippines, Jamaica, Barbados, Malaysia, the UK, etc. </span></p>
<p> </p>
<h2><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>FAQ: </strong></a></span></h2>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>Is taking performance enhancers safe for men? </strong></a></span></h3>
<p> </p>
<p><span style="font-weight: 400;">Taking performance enhancers can be unsafe if not taken responsibly and under the guidance of a medical professional. It is important to be aware of any health concerns that could be associated with taking any such supplements. </span></p>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesAustralia/"><strong>What are the benefits of taking performance enhancers for men? </strong></a></span></h3>
<p> </p>
<p><span style="font-weight: 400;">The potential benefits of taking performance enhancers include increased muscle mass, strength and power, improved performance, improved recovery, and enhanced endurance. </span></p>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/people/Animale-Male-Enhancement-Australia/100093487323174/"><strong>Are there any side effects associated with taking performance enhancers? </strong></a></span></h3>
<p> </p>
<p><strong>Yes,</strong><span style="font-weight: 400;"> there can be certain side effects associated with taking performance enhancers such as increased blood pressure and cholesterol, liver damage, and potential interference with natural hormone production. </span></p>
<p> </p>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>Are there any long-term risks associated with the use of performance enhancers? </strong></a></span></h3>
<p> </p>
<p><span style="font-weight: 400;">Yes, there can be potential long-term health risks associated with taking performance enhancers, including increased risk of developing certain types of cancer. </span></p>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementPills/"><strong>Can performance enhancers increase a man’s risk of injury? </strong></a></span></h3>
<p> </p>
<p><span style="font-weight: 400;">Yes, taking performance enhancers can increase a man’s risk of injury, as well as increase their risk of developing chronic conditions such as heart disease and stroke. </span></p>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementInAustralia/"><strong>Where to Buy Animale Male Enhancement Australia? </strong></a></span></h3>
<p> </p>
<p><span style="font-weight: 400;">The Male Enhancement is available for sale on the Official Website of </span><a href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>Animale Male Enhancement Australia</strong></a><strong>.</strong><span style="font-weight: 400;"> This Animale Male Enhancement Australia is available for sale in Australia, New Zealand, Mexico, Israel, Philippines, Jamaica, Barbados, Malaysia, the UK, Belize, Japan, Türkiye, etc. </span></p>
<p> </p>
<h2><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>Conclusion: </strong></a></span></h2>
<p> </p>
<p><span style="font-weight: 400;">In conclusion, the </span><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>Animale Male Enhancement Australia</strong></a><span style="font-weight: 400;"> is a safe and effective supplement that can help to enhance physical performance, promote cardiovascular health, and improve overall well-being. If you're looking for a natural way to boost your workouts and take your fitness to the next level in 2023, this supplement is definitely worth trying. </span></p>
<p> </p>
<p><a href="https://healthcare24hrs.com/animal-male-enhancement-Australia"><img src="https://i.ibb.co/zGS6kx8/man-woman-in-bed.jpg" alt="man-woman-in-bed" border="0" /></a></p>
<p> </p>
<p> </p>
<p> </p>
<p><span style="font-weight: 400;">It is advisable to have a clear conversation with your doctor so that you can integrate it into your schedule without any problems. This </span><a href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesAustralia/"><strong>Animale Male Enhancement Australia</strong></a><span style="font-weight: 400;"> is available for sale in Türkiye, Mexico, Israel, Philippines, Jamaica, Barbados, Malaysia, Australia, the UK, Belize, New Zealand, Japan, etc. </span></p>
<p> </p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Australia/100093487323174/"><strong>Affiliate Disclosure :</strong></a><span style="font-weight: 400;"> The links contained in this product review may result in a small commission if you opt to purchase the product recommended at no additional cost to you. This goes towards supporting our research and editorial team. Please know we only recommend high-quality products. </span></p>
<p> </p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>Disclaimer :</strong></a><span style="font-weight: 400;"> Please understand that any advice or guidelines revealed here are not even remotely substitutes for sound medical or financial advice from a licensed healthcare provider or certified financial advisor. Make sure to consult with a professional physician or financial consultant before making any purchasing decision if you use medications or have concerns following the review details shared above. Individual results may vary and are not guaranteed as the statements regarding these products have not been evaluated by the Food and Drug Administration or </span><a href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>Health Australia</strong></a><span style="font-weight: 400;">. The efficacy of these products has not been confirmed by FDA, or Health Australia approved research. These products are not intended to diagnose, treat, cure or prevent any disease and do not provide any kind of get-rich money scheme. Reviewer is not responsible for pricing inaccuracies. </span><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>Check product sales page for final prices.</strong></a><span style="font-weight: 400;"> </span></p>
<p> </p>
<p> </p>
<h2><span style="background-color: #ffff00;"><strong>Our Official Blogs ⇒</strong></span></h2>
<p><a href="https://animale-male-enhancement-in-australia.mystrikingly.com/"><strong>https://animale-male-enhancement-in-australia.mystrikingly.com/</strong></a></p>
<p><a href="https://animale-male-enhancement-in-au.mystrikingly.com/"><strong>https://animale-male-enhancement-in-au.mystrikingly.com/</strong></a></p>
<p><a href="https://animale-cbd-gummies-australia.mystrikingly.com/"><strong>https://animale-cbd-gummies-australia.mystrikingly.com/</strong></a></p>
<p><a href="https://animale-me-gummies-australia.mystrikingly.com/"><strong>https://animale-me-gummies-australia.mystrikingly.com/</strong></a></p>
<p><a href="https://animalemaleenhancementaustr119.godaddysites.com/"><strong>https://animalemaleenhancementaustr119.godaddysites.com/</strong></a></p>
<p><a href="https://animalemaleenhancementau3.godaddysites.com/"><strong>https://animalemaleenhancementau3.godaddysites.com/</strong></a></p>
<p><a href="https://animalecbdgummiesaustralia.godaddysites.com/"><strong>https://animalecbdgummiesaustralia.godaddysites.com/</strong></a></p>
<p><a href="https://animalemegummiesaustralia.godaddysites.com/"><strong>https://animalemegummiesaustralia.godaddysites.com/</strong></a></p>
<p><a href="https://animale-male-enhancement-australia-10.jimdosite.com/"><strong>https://animale-male-enhancement-australia-10.jimdosite.com/</strong></a></p>
<p><a href="https://animale-male-enhancement-au-2.jimdosite.com/"><strong>https://animale-male-enhancement-au-2.jimdosite.com/</strong></a></p>
<p><a href="https://animale-cbd-gummies-australia-15.jimdosite.com/"><strong>https://animale-cbd-gummies-australia-15.jimdosite.com/</strong></a></p>
<p><a href="https://animale-me-gummies-australia.jimdosite.com/"><strong>https://animale-me-gummies-australia.jimdosite.com/</strong></a></p>
<p><a href="https://animale-male-enhancement-austr-b38ba9.webflow.io/"><strong>https://animale-male-enhancement-austr-b38ba9.webflow.io/</strong></a></p>
<p><a href="https://animale-male-enhancement-au-e71bbc.webflow.io/"><strong>https://animale-male-enhancement-au-e71bbc.webflow.io/</strong></a></p>
<p><a href="https://animale-cbd-gummies-australia-a505bb.webflow.io/"><strong>https://animale-cbd-gummies-australia-a505bb.webflow.io/</strong></a></p>
<p><a href="https://animale-me-gummies-australia.webflow.io/"><strong>https://animale-me-gummies-australia.webflow.io/</strong></a></p>
<p><a href="https://animale-male-enhancement-in-australia.company.site/"><strong>https://animale-male-enhancement-in-australia.company.site/</strong></a></p>
<p><a href="https://animale-male-enhancement-in-au.company.site/"><strong>https://animale-male-enhancement-in-au.company.site/</strong></a></p>
<p><a href="https://animale-cbd-gummies-in-australia.company.site/"><strong>https://animale-cbd-gummies-in-australia.company.site/</strong></a></p>
<p><a href="https://animale-me-gummies-australia.company.site/"><strong>https://animale-me-gummies-australia.company.site/</strong></a></p>
<p><a href="https://animale-maleenhancement-australia.jigsy.com/"><strong>https://animale-maleenhancement-australia.jigsy.com/</strong></a></p>
<p><a href="https://animale-maleenhancement-au.jigsy.com/"><strong>https://animale-maleenhancement-au.jigsy.com/</strong></a></p>
<p><a href="https://animale-cbd-gummies-in-australia.jigsy.com/"><strong>https://animale-cbd-gummies-in-australia.jigsy.com/</strong></a></p>
<p><a href="https://animale-me-gummies-australia.jigsy.com/"><strong>https://animale-me-gummies-australia.jigsy.com/</strong></a></p>
<p><a href="https://healthcare24x7hrs.blogspot.com/2023/06/animale-male-enhancement-australia.html"><strong>https://healthcare24x7hrs.blogspot.com/2023/06/animale-male-enhancement-australia.html</strong></a></p>
<p><a href="https://healthcare24x7hrs.blogspot.com/2023/06/animale-male-enhancement-australia-is.html"><strong>https://healthcare24x7hrs.blogspot.com/2023/06/animale-male-enhancement-australia-is.html</strong></a></p>
<p><a href="https://sites.google.com/view/animalemale-enhancement-au/"><strong>https://sites.google.com/view/animalemale-enhancement-au/</strong></a></p>
<p><a href="https://sites.google.com/view/animalecbd-gummies-australia/"><strong>https://sites.google.com/view/animalecbd-gummies-australia/</strong></a></p>
<p><a href="https://colab.research.google.com/drive/1Xh4hJQpG9uhhQIzx1d908qMKfLG1Hg9D"><strong>https://colab.research.google.com/drive/1Xh4hJQpG9uhhQIzx1d908qMKfLG1Hg9D</strong></a></p>
<p><a href="https://colab.research.google.com/drive/1Rp0YWkHCKMy48v-3dV2t7RUoLn10rUn-"><strong>https://colab.research.google.com/drive/1Rp0YWkHCKMy48v-3dV2t7RUoLn10rUn-</strong></a></p>
<p><a href="https://lookerstudio.google.com/reporting/5c8f4c44-e906-419a-891b-92e6bb4c0815/page/QgCaD"><strong>https://lookerstudio.google.com/reporting/5c8f4c44-e906-419a-891b-92e6bb4c0815/page/QgCaD</strong></a></p>
<p><a href="https://lookerstudio.google.com/reporting/f6f749c5-ea0e-4f79-92b1-0643085cbd0b/page/EV1TD"><strong>https://lookerstudio.google.com/reporting/f6f749c5-ea0e-4f79-92b1-0643085cbd0b/page/EV1TD</strong></a></p>
<p><a href="https://groups.google.com/g/animale-male-enhancement-australia-price/c/Rb29p_AaTf4"><strong>https://groups.google.com/g/animale-male-enhancement-australia-price/c/Rb29p_AaTf4</strong></a></p>
<p><a href="https://groups.google.com/g/animale-male-enhancement-australia-price/c/uM6MuZxHZMk"><strong>https://groups.google.com/g/animale-male-enhancement-australia-price/c/uM6MuZxHZMk</strong></a></p>
<p><a href="https://groups.google.com/u/1/g/animale-male-enhancement-au-price/c/l4zdYYG6t2E"><strong>https://groups.google.com/u/1/g/animale-male-enhancement-au-price/c/l4zdYYG6t2E</strong></a></p>
<p><strong><a href="https://groups.google.com/u/1/g/animale-male-enhancement-au-price/c/hzVQ32cVdTA">https://groups.google.com/u/1/g/animale-male-enhancement-au-price/c/hzVQ32cVdTA</a></strong></p>
<p> </p>
<h2><span style="background-color: #ffff00;"><strong>Animale Male Enhancement Uruguay & Venezuela Official Links ==></strong></span></h2>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInUY/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInUY/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementUruguay/"><strong>https://www.facebook.com/AnimaleMaleEnhancementUruguay/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementUYUruguay/"><strong>https://www.facebook.com/AnimaleMaleEnhancementUYUruguay/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementUruguayUY/"><strong>https://www.facebook.com/AnimaleMaleEnhancementUruguayUY/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementUruguayPrice/"><strong>https://www.facebook.com/AnimaleMaleEnhancementUruguayPrice/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Uruguay/100090983842331/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-Uruguay/100090983842331/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVe/"><strong>https://www.facebook.com/AnimaleMaleEnhancementVe/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInVE/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInVE/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVenezuelaVE/"><strong>https://www.facebook.com/AnimaleMaleEnhancementVenezuelaVE/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVEVenezuela/"><strong>https://www.facebook.com/AnimaleMaleEnhancementVEVenezuela/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInVenezuela/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInVenezuela/</strong></a></p>
<p><strong><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Venezuela/100090298477014/">https://www.facebook.com/people/Animale-Male-Enhancement-Venezuela/100090298477014/</a></strong></p>
<p> </p>
<h2><span style="background-color: #ffff00;"><strong>Animale Male Enhancement South Africa & Malaysia Official Links ==></strong></span></h2>
<p><a href="https://www.facebook.com/AnimaleCBDGummiesZA/"><strong>https://www.facebook.com/AnimaleCBDGummiesZA/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementSouthAfricaBuy/"><strong>https://www.facebook.com/AnimaleMaleEnhancementSouthAfricaBuy/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesSouthAfrica/"><strong>https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesSouthAfrica/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092203209665/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092203209665/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092239339335/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092239339335/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092099453174/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092099453174/</strong></a></p>
<p><a href="https://www.facebook.com/events/1121615602562904/"><strong>https://www.facebook.com/events/1121615602562904/</strong></a></p>
<p><a href="https://www.facebook.com/events/1295846104688434/"><strong>https://www.facebook.com/events/1295846104688434/</strong></a></p>
<p><a href="https://www.facebook.com/events/1429727191099071/"><strong>https://www.facebook.com/events/1429727191099071/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementMalaysia/"><strong>https://www.facebook.com/AnimaleMaleEnhancementMalaysia/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInMalaysia/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInMalaysia/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementMY/"><strong>https://www.facebook.com/AnimaleMaleEnhancementMY/</strong></a></p>
<p><strong><a href="https://www.facebook.com/AnimaleMaleEnhancementInMY/">https://www.facebook.com/AnimaleMaleEnhancementInMY/</a></strong></p>
<p> </p>
<h2><span style="background-color: #ffff00;"><strong>KetoXplode Gummies Sweden Official Links ==></strong></span></h2>
<p><a href="https://www.facebook.com/KetoXplodeGummiesInSE/"><strong>https://www.facebook.com/KetoXplodeGummiesInSE/</strong></a></p>
<p><a href="https://www.facebook.com/KetoXplodeGummiesSverige/"><strong>https://www.facebook.com/KetoXplodeGummiesSverige/</strong></a></p>
<p><a href="https://www.facebook.com/KetoXplodeGummiesInSweden/"><strong>https://www.facebook.com/KetoXplodeGummiesInSweden/</strong></a></p>
<p><a href="https://www.facebook.com/KetoXplodeGummiesInSverige/"><strong>https://www.facebook.com/KetoXplodeGummiesInSverige/</strong></a></p>
<p><a href="https://www.facebook.com/KetoExplodeGummiesSE/"><strong>https://www.facebook.com/KetoExplodeGummiesSE/</strong></a></p>
<p><a href="https://www.facebook.com/KetoExplodeGummiesInSE/"><strong>https://www.facebook.com/KetoExplodeGummiesInSE/</strong></a></p>
<p><a href="https://www.facebook.com/KetoExplodeGummiesSweden/"><strong>https://www.facebook.com/KetoExplodeGummiesSweden/</strong></a></p>
<p><a href="https://www.facebook.com/KetoExplodeGummiesInSweden/"><strong>https://www.facebook.com/KetoExplodeGummiesInSweden/</strong></a></p>
<h2> </h2>
<h2><span style="background-color: #ffff00;"><strong>KetoXplode Gummies Finland Official Links ==></strong></span></h2>
<p><a href="https://www.facebook.com/KetoXplodeGummiesFI/"><strong>https://www.facebook.com/KetoXplodeGummiesFI/</strong></a></p>
<p><a href="https://www.facebook.com/KetoXplodeGummiesInFI/"><strong>https://www.facebook.com/KetoXplodeGummiesInFI/</strong></a></p>
<p><a href="https://www.facebook.com/KetoXplodeGummiesFinland/"><strong>https://www.facebook.com/KetoXplodeGummiesFinland/</strong></a></p>
<p><a href="https://www.facebook.com/KetoXplodeGummiesInFinland/"><strong>https://www.facebook.com/KetoXplodeGummiesInFinland/</strong></a></p>
<p><a href="https://www.facebook.com/KetoExplodeGummiesFI/"><strong>https://www.facebook.com/KetoExplodeGummiesFI/</strong></a></p>
<p><a href="https://www.facebook.com/KetoExplodeGummiesInFI/"><strong>https://www.facebook.com/KetoExplodeGummiesInFI/</strong></a></p>
<p><a href="https://www.facebook.com/KetoExplodeGummiesFinland/"><strong>https://www.facebook.com/KetoExplodeGummiesFinland/</strong></a></p>
<p><a href="https://www.facebook.com/KetoExplodeGummiesInFinland/"><strong>https://www.facebook.com/KetoExplodeGummiesInFinland/</strong></a></p>
<h2> </h2>
<h2><span style="background-color: #ffff00;"><strong>Viarecta Deutschland Official Links ==></strong></span></h2>
<p><a href="https://www.facebook.com/ViarectaDE/"><strong>https://www.facebook.com/ViarectaDE/</strong></a></p>
<p><a href="https://www.facebook.com/ViarectaEbay/"><strong>https://www.facebook.com/ViarectaEbay/</strong></a></p>
<p><a href="https://www.facebook.com/viarectaBeiDM/"><strong>https://www.facebook.com/viarectaBeiDM/</strong></a></p>
<p><a href="https://www.facebook.com/viarectakaufen/"><strong>https://www.facebook.com/viarectakaufen/</strong></a></p>
<p><a href="https://www.facebook.com/ViarectaInGermany/"><strong>https://www.facebook.com/ViarectaInGermany/</strong></a></p>
<p><a href="https://www.facebook.com/viarectaBeiAmazon/"><strong>https://www.facebook.com/viarectaBeiAmazon/</strong></a></p>
<p><a href="https://www.facebook.com/ViarectaDeutschland/"><strong>https://www.facebook.com/ViarectaDeutschland/</strong></a></p>
<p><a href="https://www.facebook.com/ViagraKaufen/"><strong>https://www.facebook.com/ViagraKaufen/</strong></a></p>
<h2> </h2>
<h2><span style="background-color: #ffff00;"><strong>Active Keto Gummies Israel Official Links ==></strong></span></h2>
<p><a href="https://www.facebook.com/ActiveKetoGummiesInIsrael/"><strong>https://www.facebook.com/ActiveKetoGummiesInIsrael/</strong></a></p>
<p><a href="https://www.facebook.com/ActiveKetoGummiesInIL/"><strong>https://www.facebook.com/ActiveKetoGummiesInIL/</strong></a></p>
<p><a href="https://www.facebook.com/ActiveKetoGummiesIsraelWebsite/"><strong>https://www.facebook.com/ActiveKetoGummiesIsraelWebsite/</strong></a></p>
<p><a href="https://www.facebook.com/ActiveKetoGummiesILBuy/"><strong>https://www.facebook.com/ActiveKetoGummiesILBuy/</strong></a></p>
<p><a href="https://www.facebook.com/ActiveKetoGummiesILOfficial/"><strong>https://www.facebook.com/ActiveKetoGummiesILOfficial/</strong></a></p>
<p><a href="https://www.facebook.com/ActiveKetoGummiesIsraelOfficial/"><strong>https://www.facebook.com/ActiveKetoGummiesIsraelOfficial/</strong></a></p>
<p><a href="https://www.facebook.com/ActiveKetoGummiesILShop/"><strong>https://www.facebook.com/ActiveKetoGummiesILShop/</strong></a></p>
<p><a href="https://www.facebook.com/ActiveKetoGummiesIsraelShop/"><strong>https://www.facebook.com/ActiveKetoGummiesIsraelShop/</strong></a></p>
<p> </p>
<h2><span style="background-color: #ffff00;"><strong>Recent Searches : </strong></span></h2>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>#AnimaleMaleEnhancementAustralia</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementPills/"><strong>#AnimaleMaleEnhancementAU</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAustralia/"><strong>#AnimaleMaleEnhancementAustraliaGrab</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>#AnimaleMaleEnhancementAustraliaPriceAtClicks</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>#AnimaleMaleEnhancementAustraliaBuy</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>#AnimaleMaleEnhancementAustraliaOfficial</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesAustralia/"><strong>#AnimaleMaleEnhancementAustraliaShopNow</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Australia/1000934873231"><strong>#AnimaleMaleEnhancementAustraliaOffer</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>#AnimaleMaleEnhancementAustraliaDiscount</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementPills/"><strong>#AnimaleMaleEnhancementAustraliaOrder</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAustralia/"><strong>#AnimaleMaleEnhancementAustraliaBenefits</strong> </a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>#AnimaleMaleEnhancementAustraliaScam</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>#AnimaleMaleEnhancementAustraliaLegit</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>#AnimaleMaleEnhancementAustraliaSexBooster</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesAustralia/"><strong>#AnimaleMaleEnhancementAustraliaPenisEnlargement</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Australia/100093487323174/"><strong>#AnimaleMaleEnhancementAustraliaStaminaBooster</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>#AnimaleMaleEnhancementAustraliaIngredients</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementPills/"><strong>#AnimaleMaleEnhancementAustraliaPurchase</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAustralia/"><strong>#AnimaleMaleEnhancementGummiesUruguay</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>#AnimaleMaleEnhancementAustraliaReview</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>#AnimaleMaleEnhancementAustraliaReviews</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>#AnimaleMaleEnhancementGummiesAU</strong></a></p> |
AnimaleMEGummiesAustralia/AnimaleMaleEnhancementGummiesCapsulesAustralia | 2023-08-30T08:55:12.000Z | [
"region:us"
] | AnimaleMEGummiesAustralia | null | null | null | 0 | 0 | <h2><span style="background-color: #ffff00;"><strong>Our Official Facebook Pages ⇒</strong></span></h2>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInAU/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementPills/"><strong>https://www.facebook.com/AnimaleMaleEnhancementPills/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAustralia/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInAustralia/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesAustralia/"><strong>https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesAustralia/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Australia/100093487323174/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-Australia/100093487323174/</strong></a></p>
<p> </p>
<h3><span style="font-weight: 400;">➥ Product Name — </span><span style="color: #800000;"><a style="color: #800000;" href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>Animale Male Enhancement Australia</strong></a><span style="font-weight: 400;"> </span></span></h3>
<h3><span style="font-weight: 400;">➥ Country — </span><span style="color: #800000;"><a style="color: #800000;" href="https://www.facebook.com/AnimaleMaleEnhancementPills/"><strong>Australia</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥ Main Benefits — </span><span style="color: #800000;"><a style="color: #800000;" href="https://www.facebook.com/AnimaleMaleEnhancementInAustralia/"><strong>Male Enhancement</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥ Rating —</span> <span style="color: #800000;"><a style="color: #800000;" href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>5.0/5.0</strong></a></span><span style="font-weight: 400;"> ⭐⭐⭐⭐⭐</span></h3>
<h3><span style="font-weight: 400;">➥ Results — </span><span style="color: #800000;"><a style="color: #800000;" href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>In 1-3 Months</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥ Availability — </span><span style="color: #800000;"><a style="color: #800000;" href="https://healthcare24hrs.com/animal-male-enhancement-Australia"><strong>Online</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥ Side Effects — </span><span style="color: #800000;"><a style="color: #800000;" href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>No Major Side Effects</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥ Official Website (Sale Is Live) — </span><span style="color: #993366;"><a style="color: #993366;" href="https://healthcare24hrs.com/animal-male-enhancement-Australia"><strong>Click Here To Order Animale Male Enhancement Australia</strong></a></span></h3>
<p> </p>
<p><span style="font-weight: 400;">From cutting-edge supplements to innovative training methods, we will cover everything you need to know to take your performance to the next level. So, if you're ready to up your game, read on to discover the best performance enhancers to try in 2023.</span></p>
<p> </p>
<h2><span style="color: #ff6600; background-color: #000000;"><a style="color: #ff6600; background-color: #000000;" href="https://healthcare24hrs.com/animal-male-enhancement-Australia"><strong>➢➢ Visit The Official Website To Get Your Male Enhancement Now ➢➢</strong></a></span></h2>
<h2> </h2>
<h2><span style="background-color: #ffff00;"><strong>Our Official Blogs ⇒</strong></span></h2>
<p><a href="https://animale-male-enhancement-in-australia.mystrikingly.com/"><strong>https://animale-male-enhancement-in-australia.mystrikingly.com/</strong></a></p>
<p><a href="https://animale-male-enhancement-in-au.mystrikingly.com/"><strong>https://animale-male-enhancement-in-au.mystrikingly.com/</strong></a></p>
<p><a href="https://animale-cbd-gummies-australia.mystrikingly.com/"><strong>https://animale-cbd-gummies-australia.mystrikingly.com/</strong></a></p>
<p><a href="https://animale-me-gummies-australia.mystrikingly.com/"><strong>https://animale-me-gummies-australia.mystrikingly.com/</strong></a></p>
<p><a href="https://animalemaleenhancementaustr119.godaddysites.com/"><strong>https://animalemaleenhancementaustr119.godaddysites.com/</strong></a></p>
<p><a href="https://animalemaleenhancementau3.godaddysites.com/"><strong>https://animalemaleenhancementau3.godaddysites.com/</strong></a></p>
<p><a href="https://animalecbdgummiesaustralia.godaddysites.com/"><strong>https://animalecbdgummiesaustralia.godaddysites.com/</strong></a></p>
<p><a href="https://animalemegummiesaustralia.godaddysites.com/"><strong>https://animalemegummiesaustralia.godaddysites.com/</strong></a></p>
<p><a href="https://animale-male-enhancement-australia-10.jimdosite.com/"><strong>https://animale-male-enhancement-australia-10.jimdosite.com/</strong></a></p>
<p><a href="https://animale-male-enhancement-au-2.jimdosite.com/"><strong>https://animale-male-enhancement-au-2.jimdosite.com/</strong></a></p>
<p><a href="https://animale-cbd-gummies-australia-15.jimdosite.com/"><strong>https://animale-cbd-gummies-australia-15.jimdosite.com/</strong></a></p>
<p><a href="https://animale-me-gummies-australia.jimdosite.com/"><strong>https://animale-me-gummies-australia.jimdosite.com/</strong></a></p>
<p><a href="https://animale-male-enhancement-austr-b38ba9.webflow.io/"><strong>https://animale-male-enhancement-austr-b38ba9.webflow.io/</strong></a></p>
<p><a href="https://animale-male-enhancement-au-e71bbc.webflow.io/"><strong>https://animale-male-enhancement-au-e71bbc.webflow.io/</strong></a></p>
<p><a href="https://animale-cbd-gummies-australia-a505bb.webflow.io/"><strong>https://animale-cbd-gummies-australia-a505bb.webflow.io/</strong></a></p>
<p><a href="https://animale-me-gummies-australia.webflow.io/"><strong>https://animale-me-gummies-australia.webflow.io/</strong></a></p>
<p><a href="https://animale-male-enhancement-in-australia.company.site/"><strong>https://animale-male-enhancement-in-australia.company.site/</strong></a></p>
<p><a href="https://animale-male-enhancement-in-au.company.site/"><strong>https://animale-male-enhancement-in-au.company.site/</strong></a></p>
<p><a href="https://animale-cbd-gummies-in-australia.company.site/"><strong>https://animale-cbd-gummies-in-australia.company.site/</strong></a></p>
<p><a href="https://animale-me-gummies-australia.company.site/"><strong>https://animale-me-gummies-australia.company.site/</strong></a></p>
<p><a href="https://animale-maleenhancement-australia.jigsy.com/"><strong>https://animale-maleenhancement-australia.jigsy.com/</strong></a></p>
<p><a href="https://animale-maleenhancement-au.jigsy.com/"><strong>https://animale-maleenhancement-au.jigsy.com/</strong></a></p>
<p><a href="https://animale-cbd-gummies-in-australia.jigsy.com/"><strong>https://animale-cbd-gummies-in-australia.jigsy.com/</strong></a></p>
<p><a href="https://animale-me-gummies-australia.jigsy.com/"><strong>https://animale-me-gummies-australia.jigsy.com/</strong></a></p>
<p><a href="https://healthcare24x7hrs.blogspot.com/2023/06/animale-male-enhancement-australia.html"><strong>https://healthcare24x7hrs.blogspot.com/2023/06/animale-male-enhancement-australia.html</strong></a></p>
<p><a href="https://healthcare24x7hrs.blogspot.com/2023/06/animale-male-enhancement-australia-is.html"><strong>https://healthcare24x7hrs.blogspot.com/2023/06/animale-male-enhancement-australia-is.html</strong></a></p>
<p><a href="https://sites.google.com/view/animalemale-enhancement-au/"><strong>https://sites.google.com/view/animalemale-enhancement-au/</strong></a></p>
<p><a href="https://sites.google.com/view/animalecbd-gummies-australia/"><strong>https://sites.google.com/view/animalecbd-gummies-australia/</strong></a></p>
<p><a href="https://colab.research.google.com/drive/1Xh4hJQpG9uhhQIzx1d908qMKfLG1Hg9D"><strong>https://colab.research.google.com/drive/1Xh4hJQpG9uhhQIzx1d908qMKfLG1Hg9D</strong></a></p>
<p><a href="https://colab.research.google.com/drive/1Rp0YWkHCKMy48v-3dV2t7RUoLn10rUn-"><strong>https://colab.research.google.com/drive/1Rp0YWkHCKMy48v-3dV2t7RUoLn10rUn-</strong></a></p>
<p><a href="https://lookerstudio.google.com/reporting/5c8f4c44-e906-419a-891b-92e6bb4c0815/page/QgCaD"><strong>https://lookerstudio.google.com/reporting/5c8f4c44-e906-419a-891b-92e6bb4c0815/page/QgCaD</strong></a></p>
<p><a href="https://lookerstudio.google.com/reporting/f6f749c5-ea0e-4f79-92b1-0643085cbd0b/page/EV1TD"><strong>https://lookerstudio.google.com/reporting/f6f749c5-ea0e-4f79-92b1-0643085cbd0b/page/EV1TD</strong></a></p>
<p><a href="https://groups.google.com/g/animale-male-enhancement-australia-price/c/Rb29p_AaTf4"><strong>https://groups.google.com/g/animale-male-enhancement-australia-price/c/Rb29p_AaTf4</strong></a></p>
<p><a href="https://groups.google.com/g/animale-male-enhancement-australia-price/c/uM6MuZxHZMk"><strong>https://groups.google.com/g/animale-male-enhancement-australia-price/c/uM6MuZxHZMk</strong></a></p>
<p><a href="https://groups.google.com/u/1/g/animale-male-enhancement-au-price/c/l4zdYYG6t2E"><strong>https://groups.google.com/u/1/g/animale-male-enhancement-au-price/c/l4zdYYG6t2E</strong></a></p>
<p><strong><a href="https://groups.google.com/u/1/g/animale-male-enhancement-au-price/c/hzVQ32cVdTA">https://groups.google.com/u/1/g/animale-male-enhancement-au-price/c/hzVQ32cVdTA</a></strong></p>
<p> </p>
<h2><span style="background-color: #ffff00;"><strong>Animale Male Enhancement Uruguay & Venezuela Official Links ==></strong></span></h2>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInUY/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInUY/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementUruguay/"><strong>https://www.facebook.com/AnimaleMaleEnhancementUruguay/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementUYUruguay/"><strong>https://www.facebook.com/AnimaleMaleEnhancementUYUruguay/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementUruguayUY/"><strong>https://www.facebook.com/AnimaleMaleEnhancementUruguayUY/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementUruguayPrice/"><strong>https://www.facebook.com/AnimaleMaleEnhancementUruguayPrice/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Uruguay/100090983842331/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-Uruguay/100090983842331/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVe/"><strong>https://www.facebook.com/AnimaleMaleEnhancementVe/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInVE/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInVE/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVenezuelaVE/"><strong>https://www.facebook.com/AnimaleMaleEnhancementVenezuelaVE/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVEVenezuela/"><strong>https://www.facebook.com/AnimaleMaleEnhancementVEVenezuela/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInVenezuela/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInVenezuela/</strong></a></p>
<p><strong><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Venezuela/100090298477014/">https://www.facebook.com/people/Animale-Male-Enhancement-Venezuela/100090298477014/</a></strong></p>
<p> </p>
<h2><span style="background-color: #ffff00;"><strong>Animale Male Enhancement South Africa & Malaysia Official Links ==></strong></span></h2>
<p><a href="https://www.facebook.com/AnimaleCBDGummiesZA/"><strong>https://www.facebook.com/AnimaleCBDGummiesZA/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementSouthAfricaBuy/"><strong>https://www.facebook.com/AnimaleMaleEnhancementSouthAfricaBuy/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesSouthAfrica/"><strong>https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesSouthAfrica/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092203209665/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092203209665/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092239339335/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092239339335/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092099453174/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092099453174/</strong></a></p>
<p><a href="https://www.facebook.com/events/1121615602562904/"><strong>https://www.facebook.com/events/1121615602562904/</strong></a></p>
<p><a href="https://www.facebook.com/events/1295846104688434/"><strong>https://www.facebook.com/events/1295846104688434/</strong></a></p>
<p><a href="https://www.facebook.com/events/1429727191099071/"><strong>https://www.facebook.com/events/1429727191099071/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementMalaysia/"><strong>https://www.facebook.com/AnimaleMaleEnhancementMalaysia/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInMalaysia/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInMalaysia/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementMY/"><strong>https://www.facebook.com/AnimaleMaleEnhancementMY/</strong></a></p>
<p><strong><a href="https://www.facebook.com/AnimaleMaleEnhancementInMY/">https://www.facebook.com/AnimaleMaleEnhancementInMY/</a></strong></p>
<p> </p>
<h2><span style="background-color: #ffff00;"><strong>KetoXplode Gummies Sweden Official Links ==></strong></span></h2>
<p><a href="https://www.facebook.com/KetoXplodeGummiesInSE/"><strong>https://www.facebook.com/KetoXplodeGummiesInSE/</strong></a></p>
<p><a href="https://www.facebook.com/KetoXplodeGummiesSverige/"><strong>https://www.facebook.com/KetoXplodeGummiesSverige/</strong></a></p>
<p><a href="https://www.facebook.com/KetoXplodeGummiesInSweden/"><strong>https://www.facebook.com/KetoXplodeGummiesInSweden/</strong></a></p>
<p><a href="https://www.facebook.com/KetoXplodeGummiesInSverige/"><strong>https://www.facebook.com/KetoXplodeGummiesInSverige/</strong></a></p>
<p><a href="https://www.facebook.com/KetoExplodeGummiesSE/"><strong>https://www.facebook.com/KetoExplodeGummiesSE/</strong></a></p>
<p><a href="https://www.facebook.com/KetoExplodeGummiesInSE/"><strong>https://www.facebook.com/KetoExplodeGummiesInSE/</strong></a></p>
<p><a href="https://www.facebook.com/KetoExplodeGummiesSweden/"><strong>https://www.facebook.com/KetoExplodeGummiesSweden/</strong></a></p>
<p><a href="https://www.facebook.com/KetoExplodeGummiesInSweden/"><strong>https://www.facebook.com/KetoExplodeGummiesInSweden/</strong></a></p>
<p> </p>
<h2><span style="background-color: #ffff00;"><strong>KetoXplode Gummies Finland Official Links ==></strong></span></h2>
<p><a href="https://www.facebook.com/KetoXplodeGummiesFI/"><strong>https://www.facebook.com/KetoXplodeGummiesFI/</strong></a></p>
<p><a href="https://www.facebook.com/KetoXplodeGummiesInFI/"><strong>https://www.facebook.com/KetoXplodeGummiesInFI/</strong></a></p>
<p><a href="https://www.facebook.com/KetoXplodeGummiesFinland/"><strong>https://www.facebook.com/KetoXplodeGummiesFinland/</strong></a></p>
<p><a href="https://www.facebook.com/KetoXplodeGummiesInFinland/"><strong>https://www.facebook.com/KetoXplodeGummiesInFinland/</strong></a></p>
<p><a href="https://www.facebook.com/KetoExplodeGummiesFI/"><strong>https://www.facebook.com/KetoExplodeGummiesFI/</strong></a></p>
<p><a href="https://www.facebook.com/KetoExplodeGummiesInFI/"><strong>https://www.facebook.com/KetoExplodeGummiesInFI/</strong></a></p>
<p><a href="https://www.facebook.com/KetoExplodeGummiesFinland/"><strong>https://www.facebook.com/KetoExplodeGummiesFinland/</strong></a></p>
<p><a href="https://www.facebook.com/KetoExplodeGummiesInFinland/"><strong>https://www.facebook.com/KetoExplodeGummiesInFinland/</strong></a></p>
<p> </p>
<h2><span style="background-color: #ffff00;"><strong>Viarecta Deutschland Official Links ==></strong></span></h2>
<p><a href="https://www.facebook.com/ViarectaDE/"><strong>https://www.facebook.com/ViarectaDE/</strong></a></p>
<p><a href="https://www.facebook.com/ViarectaEbay/"><strong>https://www.facebook.com/ViarectaEbay/</strong></a></p>
<p><a href="https://www.facebook.com/viarectaBeiDM/"><strong>https://www.facebook.com/viarectaBeiDM/</strong></a></p>
<p><a href="https://www.facebook.com/viarectakaufen/"><strong>https://www.facebook.com/viarectakaufen/</strong></a></p>
<p><a href="https://www.facebook.com/ViarectaInGermany/"><strong>https://www.facebook.com/ViarectaInGermany/</strong></a></p>
<p><a href="https://www.facebook.com/viarectaBeiAmazon/"><strong>https://www.facebook.com/viarectaBeiAmazon/</strong></a></p>
<p><a href="https://www.facebook.com/ViarectaDeutschland/"><strong>https://www.facebook.com/ViarectaDeutschland/</strong></a></p>
<p><a href="https://www.facebook.com/ViagraKaufen/"><strong>https://www.facebook.com/ViagraKaufen/</strong></a></p>
<p> </p>
<h2><span style="background-color: #ffff00;"><strong>Active Keto Gummies Israel Official Links ==></strong></span></h2>
<p><a href="https://www.facebook.com/ActiveKetoGummiesInIsrael/"><strong>https://www.facebook.com/ActiveKetoGummiesInIsrael/</strong></a></p>
<p><a href="https://www.facebook.com/ActiveKetoGummiesInIL/"><strong>https://www.facebook.com/ActiveKetoGummiesInIL/</strong></a></p>
<p><a href="https://www.facebook.com/ActiveKetoGummiesIsraelWebsite/"><strong>https://www.facebook.com/ActiveKetoGummiesIsraelWebsite/</strong></a></p>
<p><a href="https://www.facebook.com/ActiveKetoGummiesILBuy/"><strong>https://www.facebook.com/ActiveKetoGummiesILBuy/</strong></a></p>
<p><a href="https://www.facebook.com/ActiveKetoGummiesILOfficial/"><strong>https://www.facebook.com/ActiveKetoGummiesILOfficial/</strong></a></p>
<p><a href="https://www.facebook.com/ActiveKetoGummiesIsraelOfficial/"><strong>https://www.facebook.com/ActiveKetoGummiesIsraelOfficial/</strong></a></p>
<p><a href="https://www.facebook.com/ActiveKetoGummiesILShop/"><strong>https://www.facebook.com/ActiveKetoGummiesILShop/</strong></a></p>
<p><a href="https://www.facebook.com/ActiveKetoGummiesIsraelShop/"><strong>https://www.facebook.com/ActiveKetoGummiesIsraelShop/</strong></a></p>
<p> </p>
<h2><span style="background-color: #ffff00;"><strong>Recent Searches : </strong></span></h2>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>#AnimaleMaleEnhancementAustralia</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementPills/"><strong>#AnimaleMaleEnhancementAU</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAustralia/"><strong>#AnimaleMaleEnhancementAustraliaGrab</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>#AnimaleMaleEnhancementAustraliaPriceAtClicks</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>#AnimaleMaleEnhancementAustraliaBuy</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>#AnimaleMaleEnhancementAustraliaOfficial</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesAustralia/"><strong>#AnimaleMaleEnhancementAustraliaShopNow</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Australia/1000934873231"><strong>#AnimaleMaleEnhancementAustraliaOffer</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>#AnimaleMaleEnhancementAustraliaDiscount</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementPills/"><strong>#AnimaleMaleEnhancementAustraliaOrder</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAustralia/"><strong>#AnimaleMaleEnhancementAustraliaBenefits</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>#AnimaleMaleEnhancementAustraliaScam</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>#AnimaleMaleEnhancementAustraliaLegit</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>#AnimaleMaleEnhancementAustraliaSexBooster</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesAustralia/"><strong>#AnimaleMaleEnhancementAustraliaPenisEnlargement</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Australia/100093487323174/"><strong>#AnimaleMaleEnhancementAustraliaStaminaBooster</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>#AnimaleMaleEnhancementAustraliaIngredients</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementPills/"><strong>#AnimaleMaleEnhancementAustraliaPurchase</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAustralia/"><strong>#AnimaleMaleEnhancementGummiesUruguay</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>#AnimaleMaleEnhancementAustraliaReview</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>#AnimaleMaleEnhancementAustraliaReviews</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>#AnimaleMaleEnhancementGummiesAU</strong></a></p> |
datascience555/mock1 | 2023-08-31T06:14:14.000Z | [
"region:us"
] | datascience555 | null | null | null | 0 | 0 | Entry not found |
MaggiePai/narrativeQA-test-CODE | 2023-08-31T08:44:38.000Z | [
"region:us"
] | MaggiePai | null | null | null | 0 | 0 | Entry not found |