---
pretty_name: Evaluation run of delayedkarma/NeuralHermes-2.5-Mistral-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [delayedkarma/NeuralHermes-2.5-Mistral-7B](https://huggingface.co/delayedkarma/NeuralHermes-2.5-Mistral-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_delayedkarma__NeuralHermes-2.5-Mistral-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T07:55:22.452252](https://huggingface.co/datasets/open-llm-leaderboard/details_delayedkarma__NeuralHermes-2.5-Mistral-7B/blob/main/results_2024-03-10T07-55-22.452252.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.631917558240621,\n\
\ \"acc_stderr\": 0.032259628416007755,\n \"acc_norm\": 0.6382580775427679,\n\
\ \"acc_norm_stderr\": 0.03290869969358224,\n \"mc1\": 0.3684210526315789,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.5383196217878969,\n\
\ \"mc2_stderr\": 0.015342261550018428\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6245733788395904,\n \"acc_stderr\": 0.014150631435111726,\n\
\ \"acc_norm\": 0.6655290102389079,\n \"acc_norm_stderr\": 0.013787460322441374\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6605257916749652,\n\
\ \"acc_stderr\": 0.004725630911520331,\n \"acc_norm\": 0.8500298745269866,\n\
\ \"acc_norm_stderr\": 0.0035631244274585212\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.0286372356398009,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.0286372356398009\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.03252909619613197,\n\
\ \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.03252909619613197\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055266,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055266\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.023415293433568532,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.023415293433568532\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.03510766597959215,\n\
\ \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.03510766597959215\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.031584153240477114,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.031584153240477114\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026704,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026704\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6076923076923076,\n \"acc_stderr\": 0.024756000382130956,\n\
\ \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.024756000382130956\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911499,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911499\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.03068473711513537,\n \
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.03068473711513537\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8366972477064221,\n \"acc_stderr\": 0.01584825580650155,\n \"\
acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.01584825580650155\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437413,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437413\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.039578354719809805,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.039578354719809805\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.02250903393707781,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.02250903393707781\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n\
\ \"acc_stderr\": 0.013740797258579823,\n \"acc_norm\": 0.8199233716475096,\n\
\ \"acc_norm_stderr\": 0.013740797258579823\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3128491620111732,\n\
\ \"acc_stderr\": 0.015506892594647267,\n \"acc_norm\": 0.3128491620111732,\n\
\ \"acc_norm_stderr\": 0.015506892594647267\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.02505850331695814,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.02505850331695814\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.026160584450140446,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.026160584450140446\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.524822695035461,\n \"acc_stderr\": 0.029790719243829714,\n \
\ \"acc_norm\": 0.524822695035461,\n \"acc_norm_stderr\": 0.029790719243829714\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n\
\ \"acc_stderr\": 0.012739711554045699,\n \"acc_norm\": 0.4654498044328553,\n\
\ \"acc_norm_stderr\": 0.012739711554045699\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396546,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396546\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n\
\ \"acc_stderr\": 0.027686913588013024,\n \"acc_norm\": 0.8109452736318408,\n\
\ \"acc_norm_stderr\": 0.027686913588013024\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3684210526315789,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.5383196217878969,\n\
\ \"mc2_stderr\": 0.015342261550018428\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7797947908445146,\n \"acc_stderr\": 0.011646276755089688\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.34040940106141016,\n \
\ \"acc_stderr\": 0.013052097103299099\n }\n}\n```"
repo_url: https://huggingface.co/delayedkarma/NeuralHermes-2.5-Mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|arc:challenge|25_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|gsm8k|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hellaswag|10_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|winogrande|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T07-55-22.452252.parquet'
- config_name: results
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- results_2024-03-10T07-55-22.452252.parquet
- split: latest
path:
- results_2024-03-10T07-55-22.452252.parquet
---
# Dataset Card for Evaluation run of delayedkarma/NeuralHermes-2.5-Mistral-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [delayedkarma/NeuralHermes-2.5-Mistral-7B](https://huggingface.co/delayedkarma/NeuralHermes-2.5-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_delayedkarma__NeuralHermes-2.5-Mistral-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-10T07:55:22.452252](https://huggingface.co/datasets/open-llm-leaderboard/details_delayedkarma__NeuralHermes-2.5-Mistral-7B/blob/main/results_2024-03-10T07-55-22.452252.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.631917558240621,
"acc_stderr": 0.032259628416007755,
"acc_norm": 0.6382580775427679,
"acc_norm_stderr": 0.03290869969358224,
"mc1": 0.3684210526315789,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.5383196217878969,
"mc2_stderr": 0.015342261550018428
},
"harness|arc:challenge|25": {
"acc": 0.6245733788395904,
"acc_stderr": 0.014150631435111726,
"acc_norm": 0.6655290102389079,
"acc_norm_stderr": 0.013787460322441374
},
"harness|hellaswag|10": {
"acc": 0.6605257916749652,
"acc_stderr": 0.004725630911520331,
"acc_norm": 0.8500298745269866,
"acc_norm_stderr": 0.0035631244274585212
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.0286372356398009,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.0286372356398009
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105653,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105653
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.03252909619613197,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.03252909619613197
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055266,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055266
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568532,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568532
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.03510766597959215,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.03510766597959215
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.031584153240477114,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.031584153240477114
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026704,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6076923076923076,
"acc_stderr": 0.024756000382130956,
"acc_norm": 0.6076923076923076,
"acc_norm_stderr": 0.024756000382130956
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.02803792996911499,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.02803792996911499
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.03068473711513537,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.03068473711513537
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.01584825580650155,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.01584825580650155
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437413,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437413
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229143,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229143
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.039578354719809805,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.039578354719809805
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.02250903393707781,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.02250903393707781
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579823,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579823
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3128491620111732,
"acc_stderr": 0.015506892594647267,
"acc_norm": 0.3128491620111732,
"acc_norm_stderr": 0.015506892594647267
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.02505850331695814,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.02505850331695814
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140446,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140446
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.02399350170904211,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.02399350170904211
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.524822695035461,
"acc_stderr": 0.029790719243829714,
"acc_norm": 0.524822695035461,
"acc_norm_stderr": 0.029790719243829714
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.012739711554045699,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.012739711554045699
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396546,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396546
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142773,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.027686913588013024,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.027686913588013024
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3684210526315789,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.5383196217878969,
"mc2_stderr": 0.015342261550018428
},
"harness|winogrande|5": {
"acc": 0.7797947908445146,
"acc_stderr": 0.011646276755089688
},
"harness|gsm8k|5": {
"acc": 0.34040940106141016,
"acc_stderr": 0.013052097103299099
}
}
```
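As noted above, run-specific splits are named after the run timestamp, with `-` and `:` replaced by `_` (compare the run `2024-03-10T07:55:22.452252` with its split `2024_03_10T07_55_22.452252`). A minimal sketch of that conversion, inferred from the split names in this card rather than taken from any official utility:

```python
def timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp to the split name used in this repository.

    e.g. "2024-03-10T07:55:22.452252" -> "2024_03_10T07_55_22.452252"
    """
    date, time = timestamp.split("T")
    # Dashes in the date and colons in the time become underscores;
    # the "T" separator and the fractional seconds are kept as-is.
    return date.replace("-", "_") + "T" + time.replace(":", "_")

print(timestamp_to_split("2024-03-10T07:55:22.452252"))
# → 2024_03_10T07_55_22.452252
```

Passing the resulting name as `split=` to `load_dataset` (instead of `"latest"`) selects that specific run.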
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
---
task_categories:
- audio-classification
language:
- en
tags:
- audio
- text
- contrastive learning
pretty_name: freesound
size_categories:
- 1M<n<10M
---
# LAION-Audio-630K Freesound Dataset
[LAION-Audio-630K](https://github.com/LAION-AI/audio-dataset/blob/main/laion-audio-630k/README.md) is the largest publicly available audio-text dataset, an order of magnitude larger than previous audio-text datasets (as of 2022-11-05). Notably, it combines eight distinct datasets, including the Freesound dataset.
Specifically, this Hugging Face repository contains two versions of the Freesound dataset. Details of each version (e.g., how the captions were made) can be found in the "Data Card" column of the table below.
- **Freesound (full)**: The complete Freesound dataset, available in the `/freesound` folder.
- **Freesound (no overlap)**: Derived from Freesound (full), with samples from ESC50, FSD50K, UrbanSound8K, and Clotho removed; available in the `/freesound_no_overlap` folder.
For the structure and format of the `freesound` and `freesound_no_overlap` folders, please refer to [this page](https://github.com/LAION-AI/audio-dataset/blob/main/data_preprocess/README.md).
| Name | Duration | Number of Samples | Data Type | Metadata | Data Card |
|------|----------|-------------------|-----------|----------|-----------|
| Freesound (no overlap) | 2817.31 hrs | 460801 | 1-2 captions per audio, audio | [website](https://freesound.org/) <br> [csv]() | [data card](/data_card/freesound.md) |
| Freesound (full) | 3033.38 hrs | 515581 | 1-2 captions per audio, audio | [website](https://freesound.org/) <br> [csv]() | [data card](/data_card/freesound.md) |
## Metadata CSV file
For each of the two datasets, we provide a metadata CSV file with the following columns:
- **audio_filename**: The filename of the audio file in the `.tar` archives, e.g. `2394.flac`.
- **caption_i**: The i-th caption of the audio file.
- **freesound_id**: The Freesound ID of the audio file.
- **freesound_url**: The URL of the audio file on freesound.org.
- **username**: The Freesound username of the uploader of the audio file.
- **license**: The license of the audio file, e.g. `http://creativecommons.org/licenses/by/3.0/`.
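As a sketch of how the metadata file can be consumed, the snippet below filters rows by license before downloading any audio. Only the column names come from the list above; the filenames, usernames, URLs, and row values in the sample are made up for illustration.

```python
import csv
import io

# Hypothetical excerpt of the metadata CSV. The column names follow the
# data card above (audio_filename, caption_1, freesound_id, username,
# freesound_url, license); the row values are invented for this example.
SAMPLE_CSV = """audio_filename,caption_1,freesound_id,username,freesound_url,license
2394.flac,rain falling on a tin roof,2394,alice,https://freesound.org/people/alice/sounds/2394/,http://creativecommons.org/licenses/by/3.0/
9001.flac,dog barking in the distance,9001,bob,https://freesound.org/people/bob/sounds/9001/,http://creativecommons.org/publicdomain/zero/1.0/
"""

def rows_with_license(csv_text: str, license_url: str) -> list:
    """Return the metadata rows whose `license` column matches license_url."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [row for row in reader if row["license"] == license_url]

cc_by_rows = rows_with_license(SAMPLE_CSV, "http://creativecommons.org/licenses/by/3.0/")
print([row["audio_filename"] for row in cc_by_rows])  # ['2394.flac']
```

In practice one would pass the real metadata CSV shipped with `/freesound` or `/freesound_no_overlap` instead of the inline sample, and use the matching `audio_filename` values to select members from the `.tar` archives.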
## Credits & Licence
- **!!!TERMS OF USE!!!**: **By downloading files in this repository, you agree that you will use them <u>for research purposes only</u>. If you want to use Freesound clips in LAION-Audio-630K for commercial purposes, please contact Frederic Font Corbera at frederic.font@upf.edu.**
### Freesound Credit:
All audio clips from Freesound are released under Creative Commons (CC) licenses. Each clip carries its own license as defined by its uploader on Freesound; some require attribution to the original author and some forbid further commercial reuse. Specifically, here are the license statistics for the audio clips involved in LAION-Audio-630K:
| License | Number of Samples |
| :--- | :--- |
| http://creativecommons.org/publicdomain/zero/1.0/ | 260134 |
| https://creativecommons.org/licenses/by/4.0/ | 97090 |
| http://creativecommons.org/licenses/by/3.0/ | 89337 |
| http://creativecommons.org/licenses/by-nc/3.0/ | 31680 |
| https://creativecommons.org/licenses/by-nc/4.0/ | 26736 |
| http://creativecommons.org/licenses/sampling+/1.0/ | 11116 |
## Acknowledgement
The whole collection process, as well as all usage of LAION-Audio-630K, is conducted by the German non-profit research organization [LAION](https://laion.ai/). All contributors and collectors of the dataset are considered open-source contributors affiliated with LAION. These community contributors (Discord IDs) include, but are not limited to: @marianna13#7139, @Chr0my#0173, @PiEquals4#1909, @Yuchen Hui#8574, @Antoniooooo#4758, @IYWO#9072, krishna#1648, @dicknascarsixtynine#3885, and @turian#1607. We thank all of them for their efforts on the LAION-Audio-630K dataset. |
liuyanchen1015/MULTI_VALUE_qqp_drop_aux_wh | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 4830871
num_examples: 33522
- name: test
num_bytes: 47375999
num_examples: 325344
- name: train
num_bytes: 43406035
num_examples: 301000
download_size: 58949718
dataset_size: 95612905
---
# Dataset Card for "MULTI_VALUE_qqp_drop_aux_wh"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rishi1entirerbb/inswapper_128.onnx | ---
license: wtfpl
---
|
AdapterOcean/datasci-standardized_embedded | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float32
splits:
- name: train
num_bytes: 12600352
num_examples: 1982
download_size: 6271462
dataset_size: 12600352
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "datasci-standardized_embedded"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_DreadPoor__NewtoccineLake-slerp-7B | ---
pretty_name: Evaluation run of DreadPoor/NewtoccineLake-slerp-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DreadPoor/NewtoccineLake-slerp-7B](https://huggingface.co/DreadPoor/NewtoccineLake-slerp-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DreadPoor__NewtoccineLake-slerp-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T08:47:01.875874](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__NewtoccineLake-slerp-7B/blob/main/results_2024-02-02T08-47-01.875874.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6491571860381778,\n\
\ \"acc_stderr\": 0.03206686241411434,\n \"acc_norm\": 0.6505026667143076,\n\
\ \"acc_norm_stderr\": 0.03271587226996518,\n \"mc1\": 0.41982864137086906,\n\
\ \"mc1_stderr\": 0.01727703030177577,\n \"mc2\": 0.5995154555010791,\n\
\ \"mc2_stderr\": 0.015251277020194909\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6527303754266212,\n \"acc_stderr\": 0.01391303452962045,\n\
\ \"acc_norm\": 0.6868600682593856,\n \"acc_norm_stderr\": 0.013552671543623494\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6796454889464251,\n\
\ \"acc_stderr\": 0.004656591678606763,\n \"acc_norm\": 0.859788886675961,\n\
\ \"acc_norm_stderr\": 0.0034649633793799386\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.025197101074246487,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.025197101074246487\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7709677419354839,\n \"acc_stderr\": 0.023904914311782648,\n \"\
acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.023904914311782648\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"\
acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.02247325333276875,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.02247325333276875\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n\
\ \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.02918571494985741,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.02918571494985741\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297793,\n \
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092444,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092444\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"\
acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601446,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601446\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.0306365913486998,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.0306365913486998\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n\
\ \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n\
\ \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.013586619219903335,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.013586619219903335\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.023357365785874037,\n\
\ \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.023357365785874037\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4201117318435754,\n\
\ \"acc_stderr\": 0.016507671073256402,\n \"acc_norm\": 0.4201117318435754,\n\
\ \"acc_norm_stderr\": 0.016507671073256402\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.02573885479781874,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.02573885479781874\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7331189710610932,\n\
\ \"acc_stderr\": 0.02512263760881666,\n \"acc_norm\": 0.7331189710610932,\n\
\ \"acc_norm_stderr\": 0.02512263760881666\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.024748624490537368,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.024748624490537368\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291463,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291463\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47783572359843546,\n\
\ \"acc_stderr\": 0.012757683047716175,\n \"acc_norm\": 0.47783572359843546,\n\
\ \"acc_norm_stderr\": 0.012757683047716175\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6683006535947712,\n \"acc_stderr\": 0.019047485239360378,\n \
\ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.019047485239360378\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128445,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128445\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306046,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306046\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197768,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197768\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.41982864137086906,\n\
\ \"mc1_stderr\": 0.01727703030177577,\n \"mc2\": 0.5995154555010791,\n\
\ \"mc2_stderr\": 0.015251277020194909\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8153117600631413,\n \"acc_stderr\": 0.010905978112156876\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6178923426838514,\n \
\ \"acc_stderr\": 0.013384173935648492\n }\n}\n```"
repo_url: https://huggingface.co/DreadPoor/NewtoccineLake-slerp-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|arc:challenge|25_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|gsm8k|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hellaswag|10_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T08-47-01.875874.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T08-47-01.875874.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- '**/details_harness|winogrande|5_2024-02-02T08-47-01.875874.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T08-47-01.875874.parquet'
- config_name: results
data_files:
- split: 2024_02_02T08_47_01.875874
path:
- results_2024-02-02T08-47-01.875874.parquet
- split: latest
path:
- results_2024-02-02T08-47-01.875874.parquet
---
# Dataset Card for Evaluation run of DreadPoor/NewtoccineLake-slerp-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DreadPoor/NewtoccineLake-slerp-7B](https://huggingface.co/DreadPoor/NewtoccineLake-slerp-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DreadPoor__NewtoccineLake-slerp-7B",
"harness_winogrande_5",
split="train")
```
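The config names above map mechanically onto the parquet globs listed in this card's YAML `data_files` entries. A minimal sketch of that mapping, for picking configs programmatically (`config_to_glob` is a hypothetical helper written for illustration; the YAML itself is authoritative):

```python
def config_to_glob(config_name: str, timestamp: str) -> str:
    """Map a config name such as 'harness_winogrande_5' to the parquet
    glob used in this card's data_files entries (illustrative sketch)."""
    assert config_name.startswith("harness_")
    # Split off the trailing few-shot count, e.g. '5' or '0'.
    body, shots = config_name[len("harness_"):].rsplit("_", 1)
    if body.startswith("hendrycksTest_"):
        # MMLU subtasks keep 'hendrycksTest-' plus the subtask name.
        task = "hendrycksTest-" + body[len("hendrycksTest_"):]
    else:
        # Other tasks use ':' where the config name has '_',
        # e.g. 'truthfulqa_mc' -> 'truthfulqa:mc'.
        task = body.replace("_", ":")
    return f"**/details_harness|{task}|{shots}_{timestamp}.parquet"
```

For example, `config_to_glob("harness_hendrycksTest_anatomy_5", "2024-02-02T08-47-01.875874")` reproduces the glob shown under the `harness_hendrycksTest_anatomy_5` config above.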
## Latest results
These are the [latest results from run 2024-02-02T08:47:01.875874](https://huggingface.co/datasets/open-llm-leaderboard/details_DreadPoor__NewtoccineLake-slerp-7B/blob/main/results_2024-02-02T08-47-01.875874.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6491571860381778,
"acc_stderr": 0.03206686241411434,
"acc_norm": 0.6505026667143076,
"acc_norm_stderr": 0.03271587226996518,
"mc1": 0.41982864137086906,
"mc1_stderr": 0.01727703030177577,
"mc2": 0.5995154555010791,
"mc2_stderr": 0.015251277020194909
},
"harness|arc:challenge|25": {
"acc": 0.6527303754266212,
"acc_stderr": 0.01391303452962045,
"acc_norm": 0.6868600682593856,
"acc_norm_stderr": 0.013552671543623494
},
"harness|hellaswag|10": {
"acc": 0.6796454889464251,
"acc_stderr": 0.004656591678606763,
"acc_norm": 0.859788886675961,
"acc_norm_stderr": 0.0034649633793799386
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.025197101074246487,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.025197101074246487
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782648,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782648
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047711,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047711
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.02247325333276875,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.02247325333276875
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.02918571494985741,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.02918571494985741
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297793,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092444,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092444
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977748,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601446,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.0306365913486998,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.0306365913486998
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903335,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903335
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.023357365785874037,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.023357365785874037
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4201117318435754,
"acc_stderr": 0.016507671073256402,
"acc_norm": 0.4201117318435754,
"acc_norm_stderr": 0.016507671073256402
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.02573885479781874,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.02573885479781874
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7331189710610932,
"acc_stderr": 0.02512263760881666,
"acc_norm": 0.7331189710610932,
"acc_norm_stderr": 0.02512263760881666
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.024748624490537368,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.024748624490537368
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291463,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291463
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47783572359843546,
"acc_stderr": 0.012757683047716175,
"acc_norm": 0.47783572359843546,
"acc_norm_stderr": 0.012757683047716175
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.019047485239360378,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.019047485239360378
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128445,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128445
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306046,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306046
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197768,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197768
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.41982864137086906,
"mc1_stderr": 0.01727703030177577,
"mc2": 0.5995154555010791,
"mc2_stderr": 0.015251277020194909
},
"harness|winogrande|5": {
"acc": 0.8153117600631413,
"acc_stderr": 0.010905978112156876
},
"harness|gsm8k|5": {
"acc": 0.6178923426838514,
"acc_stderr": 0.013384173935648492
}
}
```
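Because the per-task metrics above are plain JSON, ranking subtasks by score is straightforward once the results file is loaded. A small sketch using a few `acc_norm` values copied from the results above (the dict is abbreviated for illustration, not the full file):

```python
# Rank hendrycksTest (MMLU) subtasks by normalized accuracy.
# The values below are a handful copied from this card's latest results.
results = {
    "harness|hendrycksTest-marketing|5": {"acc_norm": 0.9017094017094017},
    "harness|hendrycksTest-college_mathematics|5": {"acc_norm": 0.3},
    "harness|hendrycksTest-world_religions|5": {"acc_norm": 0.8362573099415205},
    "harness|hendrycksTest-high_school_physics|5": {"acc_norm": 0.3443708609271523},
}
ranked = sorted(results.items(), key=lambda kv: kv[1]["acc_norm"], reverse=True)
for task, metrics in ranked:
    # Strip the 'harness|hendrycksTest-' prefix and '|5' suffix for display.
    name = task.split("-", 1)[1].split("|")[0]
    print(f"{name}: {metrics['acc_norm']:.3f}")
```

The same pattern applies to the full `results` config: load the JSON, filter the keys you care about, and sort on the metric of interest.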
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
jlh/uci-adult-income | ---
dataset_info:
features:
- name: age
dtype: int64
- name: workclass
dtype: string
- name: fnlwgt
dtype: int64
- name: education
dtype: string
- name: education-num
dtype: int64
- name: marital-status
dtype: string
- name: occupation
dtype: string
- name: relationship
dtype: string
- name: race
dtype: string
- name: sex
dtype: string
- name: capital-gain
dtype: int64
- name: capital-loss
dtype: int64
- name: hours-per-week
dtype: int64
- name: native-country
dtype: string
- name: income
dtype:
class_label:
names:
'0': ' <=50K'
'1': ' >50K'
splits:
- name: train
num_bytes: 5552570
num_examples: 32561
download_size: 586658
dataset_size: 5552570
---
# Dataset Card for "uci-adult-income"
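The YAML metadata above declares the `income` target as a `class_label` with names `' <=50K'` and `' >50K'` (note the leading space in each name, carried over from the raw UCI file). A minimal sketch of decoding the integer label back to its string name, assuming the mapping shown in the metadata:

```python
# Minimal sketch: map the integer `income` class label back to its string name,
# mirroring the class_label names declared in the YAML metadata above.
# The leading space in each name comes from the raw UCI CSV values.
INCOME_NAMES = [" <=50K", " >50K"]

def decode_income(label_id: int) -> str:
    """Return the string class name for a 0/1 income label."""
    return INCOME_NAMES[label_id]
```

With the `datasets` library loaded, the same mapping should be available via the feature itself, e.g. `ds.features["income"].int2str(label_id)` on the loaded `train` split.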
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_VAGOsolutions__SauerkrautLM-SOLAR-Instruct | ---
pretty_name: Evaluation run of VAGOsolutions/SauerkrautLM-SOLAR-Instruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [VAGOsolutions/SauerkrautLM-SOLAR-Instruct](https://huggingface.co/VAGOsolutions/SauerkrautLM-SOLAR-Instruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_VAGOsolutions__SauerkrautLM-SOLAR-Instruct\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-20T21:12:30.565200](https://huggingface.co/datasets/open-llm-leaderboard/details_VAGOsolutions__SauerkrautLM-SOLAR-Instruct/blob/main/results_2023-12-20T21-12-30.565200.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.665790317648132,\n\
\ \"acc_stderr\": 0.0316386595234525,\n \"acc_norm\": 0.6665591981008385,\n\
\ \"acc_norm_stderr\": 0.03228345090384577,\n \"mc1\": 0.572827417380661,\n\
\ \"mc1_stderr\": 0.017316834410963926,\n \"mc2\": 0.7194604048636173,\n\
\ \"mc2_stderr\": 0.01493454902149839\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6868600682593856,\n \"acc_stderr\": 0.013552671543623496,\n\
\ \"acc_norm\": 0.7081911262798635,\n \"acc_norm_stderr\": 0.013284525292403518\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7173869747062338,\n\
\ \"acc_stderr\": 0.004493495872000109,\n \"acc_norm\": 0.8862776339374626,\n\
\ \"acc_norm_stderr\": 0.0031682493518893117\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n\
\ \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.72,\n\
\ \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n\
\ \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.035676037996391706,\n\
\ \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.035676037996391706\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n\
\ \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n\
\ \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\":\
\ 0.6170212765957447,\n \"acc_stderr\": 0.03177821250236922,\n \"\
acc_norm\": 0.6170212765957447,\n \"acc_norm_stderr\": 0.03177821250236922\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.040434618619167466,\n\
\ \"acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.040434618619167466\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.49206349206349204,\n \"acc_stderr\": 0.02574806587167328,\n \"\
acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.02574806587167328\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8193548387096774,\n \"acc_stderr\": 0.021886178567172534,\n \"\
acc_norm\": 0.8193548387096774,\n \"acc_norm_stderr\": 0.021886178567172534\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"\
acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n\
\ \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"\
acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251976,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251976\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634332,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634332\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669235,\n \"\
acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669235\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5694444444444444,\n \"acc_stderr\": 0.03376922151252335,\n \"\
acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.03376922151252335\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.0251956584289318,\n \"acc_norm\"\
: 0.8480392156862745,\n \"acc_norm_stderr\": 0.0251956584289318\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.8523206751054853,\n \"acc_stderr\": 0.0230943295825957,\n \"acc_norm\"\
: 0.8523206751054853,\n \"acc_norm_stderr\": 0.0230943295825957\n },\n\
\ \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596915,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596915\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709696,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709696\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.03492606476623791,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.03492606476623791\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n\
\ \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n\
\ \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41899441340782123,\n\
\ \"acc_stderr\": 0.016501579306861674,\n \"acc_norm\": 0.41899441340782123,\n\
\ \"acc_norm_stderr\": 0.016501579306861674\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.02473998135511359,\n\
\ \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.02473998135511359\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.025403832978179615,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.025403832978179615\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7716049382716049,\n \"acc_stderr\": 0.023358211840626267,\n\
\ \"acc_norm\": 0.7716049382716049,\n \"acc_norm_stderr\": 0.023358211840626267\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.48891786179921776,\n\
\ \"acc_stderr\": 0.01276709899852584,\n \"acc_norm\": 0.48891786179921776,\n\
\ \"acc_norm_stderr\": 0.01276709899852584\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7536764705882353,\n \"acc_stderr\": 0.02617343857052,\n\
\ \"acc_norm\": 0.7536764705882353,\n \"acc_norm_stderr\": 0.02617343857052\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069446,\n \
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069446\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466108,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466108\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n\
\ \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n\
\ \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n\
\ \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.572827417380661,\n\
\ \"mc1_stderr\": 0.017316834410963926,\n \"mc2\": 0.7194604048636173,\n\
\ \"mc2_stderr\": 0.01493454902149839\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.835043409629045,\n \"acc_stderr\": 0.01043091746823743\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6413949962092494,\n \
\ \"acc_stderr\": 0.01321031736413403\n }\n}\n```"
repo_url: https://huggingface.co/VAGOsolutions/SauerkrautLM-SOLAR-Instruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|arc:challenge|25_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|gsm8k|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hellaswag|10_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-20T21-12-30.565200.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-20T21-12-30.565200.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- '**/details_harness|winogrande|5_2023-12-20T21-12-30.565200.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-20T21-12-30.565200.parquet'
- config_name: results
data_files:
- split: 2023_12_20T21_12_30.565200
path:
- results_2023-12-20T21-12-30.565200.parquet
- split: latest
path:
- results_2023-12-20T21-12-30.565200.parquet
---
# Dataset Card for Evaluation run of VAGOsolutions/SauerkrautLM-SOLAR-Instruct
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [VAGOsolutions/SauerkrautLM-SOLAR-Instruct](https://huggingface.co/VAGOsolutions/SauerkrautLM-SOLAR-Instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_VAGOsolutions__SauerkrautLM-SOLAR-Instruct",
"harness_winogrande_5",
split="train")
```
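The run splits shown above are named with an underscore-escaped timestamp (e.g. `2023_12_20T21_12_30.565200`). A small helper can convert such a split name back into a `datetime` for sorting or comparison; `split_to_datetime` is an illustrative sketch, not part of the leaderboard tooling:

```python
from datetime import datetime

def split_to_datetime(split_name: str) -> datetime:
    """Convert a run split name such as '2023_12_20T21_12_30.565200'
    back into a datetime (underscores stand in for '-' and ':')."""
    date_part, time_part = split_name.split("T")
    iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.fromisoformat(iso)

print(split_to_datetime("2023_12_20T21_12_30.565200"))
# → 2023-12-20 21:12:30.565200
```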
## Latest results
These are the [latest results from run 2023-12-20T21:12:30.565200](https://huggingface.co/datasets/open-llm-leaderboard/details_VAGOsolutions__SauerkrautLM-SOLAR-Instruct/blob/main/results_2023-12-20T21-12-30.565200.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.665790317648132,
"acc_stderr": 0.0316386595234525,
"acc_norm": 0.6665591981008385,
"acc_norm_stderr": 0.03228345090384577,
"mc1": 0.572827417380661,
"mc1_stderr": 0.017316834410963926,
"mc2": 0.7194604048636173,
"mc2_stderr": 0.01493454902149839
},
"harness|arc:challenge|25": {
"acc": 0.6868600682593856,
"acc_stderr": 0.013552671543623496,
"acc_norm": 0.7081911262798635,
"acc_norm_stderr": 0.013284525292403518
},
"harness|hellaswag|10": {
"acc": 0.7173869747062338,
"acc_stderr": 0.004493495872000109,
"acc_norm": 0.8862776339374626,
"acc_norm_stderr": 0.0031682493518893117
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6170212765957447,
"acc_stderr": 0.03177821250236922,
"acc_norm": 0.6170212765957447,
"acc_norm_stderr": 0.03177821250236922
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.040434618619167466,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.040434618619167466
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.02574806587167328,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.02574806587167328
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8193548387096774,
"acc_stderr": 0.021886178567172534,
"acc_norm": 0.8193548387096774,
"acc_norm_stderr": 0.021886178567172534
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8121212121212121,
"acc_stderr": 0.03050193405942914,
"acc_norm": 0.8121212121212121,
"acc_norm_stderr": 0.03050193405942914
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251976,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251976
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.029344572500634332,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.029344572500634332
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669235,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669235
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.03376922151252335,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.03376922151252335
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.0251956584289318,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.0251956584289318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8523206751054853,
"acc_stderr": 0.0230943295825957,
"acc_norm": 0.8523206751054853,
"acc_norm_stderr": 0.0230943295825957
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596915,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596915
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709696,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709696
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037182,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037182
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.03492606476623791,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.03492606476623791
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41899441340782123,
"acc_stderr": 0.016501579306861674,
"acc_norm": 0.41899441340782123,
"acc_norm_stderr": 0.016501579306861674
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.025403832978179615,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.025403832978179615
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7716049382716049,
"acc_stderr": 0.023358211840626267,
"acc_norm": 0.7716049382716049,
"acc_norm_stderr": 0.023358211840626267
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.48891786179921776,
"acc_stderr": 0.01276709899852584,
"acc_norm": 0.48891786179921776,
"acc_norm_stderr": 0.01276709899852584
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7536764705882353,
"acc_stderr": 0.02617343857052,
"acc_norm": 0.7536764705882353,
"acc_norm_stderr": 0.02617343857052
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069446,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069446
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466108,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466108
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.572827417380661,
"mc1_stderr": 0.017316834410963926,
"mc2": 0.7194604048636173,
"mc2_stderr": 0.01493454902149839
},
"harness|winogrande|5": {
"acc": 0.835043409629045,
"acc_stderr": 0.01043091746823743
},
"harness|gsm8k|5": {
"acc": 0.6413949962092494,
"acc_stderr": 0.01321031736413403
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/absinthe_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of absinthe/アブサント/苦艾 (Arknights)
This is the dataset of absinthe/アブサント/苦艾 (Arknights), containing 151 images and their tags.
The core tags of this character are `animal_ears, bear_ears, red_eyes, short_hair, breasts, grey_hair, bear_girl, hairband, brown_hair, black_hairband, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 151 | 210.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/absinthe_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 151 | 180.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/absinthe_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 361 | 351.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/absinthe_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/absinthe_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
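If you don't want to depend on waifuc, the IMG+TXT packages above can also be read with plain Python. The sketch below is an assumption-based helper (not part of the official tooling): it assumes each image file in the extracted zip sits next to a `.txt` file holding its comma-separated tags, which is the layout of the IMG+TXT packages.

```python
from pathlib import Path

IMAGE_EXTS = {'.png', '.jpg', '.jpeg', '.webp'}

def pair_images_with_tags(dataset_dir):
    """Collect (image_path, tags) pairs from an extracted IMG+TXT package.

    Assumes each image file has a sibling .txt file containing
    comma-separated tags -- the layout used by the IMG+TXT zips.
    """
    pairs = []
    for path in sorted(Path(dataset_dir).rglob('*')):
        if path.suffix.lower() in IMAGE_EXTS:
            tag_file = path.with_suffix('.txt')
            if tag_file.exists():
                # split the tag line on commas and drop empty entries
                tags = [t.strip()
                        for t in tag_file.read_text(encoding='utf-8').split(',')
                        if t.strip()]
                pairs.append((path, tags))
    return pairs
```

For example, after extracting `dataset-1200.zip` into `dataset_dir`, `pair_images_with_tags('dataset_dir')` returns a list you can feed directly into a training loop.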
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 15 |  |  |  |  |  | 1girl, black_pantyhose, long_sleeves, solo, looking_at_viewer, necktie, simple_background, white_background, blue_skirt, full_body, black_shirt, sitting, sneakers, black_jacket, closed_mouth, bag, holding, weapon |
| 1 | 14 |  |  |  |  |  | 1girl, necktie, solo, upper_body, simple_background, looking_at_viewer, white_background, closed_mouth, long_sleeves, black_shirt, black_jacket |
| 2 | 5 |  |  |  |  |  | 1girl, 1other, blush, headpat, simple_background, solo_focus, upper_body, white_background, closed_mouth, black_jacket, black_shirt, disembodied_limb, long_sleeves, rhodes_island_logo_(arknights), blue_necktie, brown_shirt, collared_shirt, flying_sweatdrops, medium_breasts |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_pantyhose | long_sleeves | solo | looking_at_viewer | necktie | simple_background | white_background | blue_skirt | full_body | black_shirt | sitting | sneakers | black_jacket | closed_mouth | bag | holding | weapon | upper_body | 1other | blush | headpat | solo_focus | disembodied_limb | rhodes_island_logo_(arknights) | blue_necktie | brown_shirt | collared_shirt | flying_sweatdrops | medium_breasts |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------------|:---------------|:-------|:--------------------|:----------|:--------------------|:-------------------|:-------------|:------------|:--------------|:----------|:-----------|:---------------|:---------------|:------|:----------|:---------|:-------------|:---------|:--------|:----------|:-------------|:-------------------|:---------------------------------|:---------------|:--------------|:-----------------|:--------------------|:-----------------|
| 0 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 1 | 14 |  |  |  |  |  | X | | X | X | X | X | X | X | | | X | | | X | X | | | | X | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | | X | | | | X | X | | | X | | | X | X | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-16000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 990033
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
FanChen0116/19100_chat_50x_slot_limit_pvi | ---
dataset_info:
features:
- name: id
dtype: int64
- name: tokens
sequence: string
- name: labels
sequence:
class_label:
names:
'0': O
'1': I-time
'2': B-date
'3': B-last_name
'4': B-people
'5': I-date
'6': I-people
'7': I-last_name
'8': I-first_name
'9': B-first_name
'10': B-time
- name: request_slot
sequence: string
splits:
- name: train
num_bytes: 580637
num_examples: 3200
- name: validation
num_bytes: 5405
num_examples: 32
- name: test
num_bytes: 5405
num_examples: 32
download_size: 95764
dataset_size: 591447
---
# Dataset Card for "19100_chat_50x_slot_limit_pvi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/607587de | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 184
num_examples: 10
download_size: 1337
dataset_size: 184
---
# Dataset Card for "607587de"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Heba30018/final | ---
dataset_info:
features:
- name: image
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 271721.0
num_examples: 6469
download_size: 89923
dataset_size: 271721.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ranWang/books_paragraph | ---
dataset_info:
features:
- name: raw_text
dtype: string
- name: output_text
dtype: string
- name: file_name
dtype: string
- name: is_hard_linebreak
sequence: bool
splits:
- name: train
num_bytes: 2212277566
num_examples: 2982
download_size: 1336879971
dataset_size: 2212277566
---
# Dataset Card for "books_paragraph"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dap305/processed_europarlv7_subset50k | ---
dataset_info:
features:
- name: id
dtype: int64
- name: translation
struct:
- name: en
dtype: string
- name: es
dtype: string
splits:
- name: train
num_bytes: 14932116.608169874
num_examples: 46349
- name: validation
num_bytes: 786087.3918301256
num_examples: 2440
- name: test
num_bytes: 347061
num_examples: 1000
download_size: 9956095
dataset_size: 16065265
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
task_categories:
- translation
language:
- en
- es
pretty_name: EuroParl.v7.Subset50k
size_categories:
- 10K<n<100K
--- |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-132000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1006770
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ashishsr/icddxdescmap_rlhf | ---
dataset_info:
features:
- name: docdesc
dtype: string
- name: dxcode
dtype: string
- name: shortdesc
dtype: string
- name: longdesc
dtype: string
- name: dxcode_rejected
dtype: string
- name: shortdesc_rejected
dtype: string
- name: longdesc_rejected
dtype: string
splits:
- name: train
num_bytes: 187747
num_examples: 900
download_size: 63107
dataset_size: 187747
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
HuggingFaceM4/debug_MathVista_mcq_to_remove | ---
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: pid
dtype: string
- name: question
dtype: string
- name: answers
sequence: string
- name: image
dtype: image
splits:
- name: validation
num_bytes: 53445844.44
num_examples: 540
- name: test
num_bytes: 282205467.125462
num_examples: 2852
download_size: 282919996
dataset_size: 335651311.565462
---
# Dataset Card for "debug_MathVista_mcq_to_remove"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nater-EX99/Sylphynford_Tachibana_Unprocessed | ---
license: other
---
|
kewu93/three_styles_prompted_all_512x512_excluded_training | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: style_class
dtype: string
splits:
- name: train
num_bytes: 7284057.537128714
num_examples: 300
- name: val
num_bytes: 4317601.0
num_examples: 150
download_size: 12016133
dataset_size: 11601658.537128713
---
# Dataset Card for "three_styles_prompted_all_512x512_excluded_training"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-SpaceDoge__dataset_test_1-SpaceDoge__dataset_test_1-a8c4b7-1826662824 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- SpaceDoge/dataset_test_1
eval_info:
task: text_zero_shot_classification
model: inverse-scaling/opt-2.7b_eval
metrics: []
dataset_name: SpaceDoge/dataset_test_1
dataset_config: SpaceDoge--dataset_test_1
dataset_split: test
col_mapping:
text: prompt
classes: classes
target: answer_index
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: inverse-scaling/opt-2.7b_eval
* Dataset: SpaceDoge/dataset_test_1
* Config: SpaceDoge--dataset_test_1
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@SpaceDoge](https://huggingface.co/SpaceDoge) for evaluating this model. |
K-University/KU_Dataset | ---
license: apache-2.0
---
|
ccerpas/sentiment-banking-1 | ---
dataset_info:
features:
- name: text
dtype: string
- name: inputs
struct:
- name: text
dtype: string
- name: prediction
list:
- name: label
dtype: string
- name: score
dtype: float64
- name: prediction_agent
dtype: string
- name: annotation
dtype: 'null'
- name: annotation_agent
dtype: 'null'
- name: multi_label
dtype: bool
- name: explanation
dtype: 'null'
- name: id
dtype: 'null'
- name: metadata
struct:
- name: category
dtype: int64
- name: status
dtype: string
- name: event_timestamp
dtype: 'null'
- name: metrics
dtype: 'null'
splits:
- name: train
num_bytes: 1205760
num_examples: 5001
download_size: 448505
dataset_size: 1205760
---
# Dataset Card for "sentiment-banking-1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jlbaker361/kaggle_females_dim_128_40k | ---
dataset_info:
features:
- name: image
dtype: image
- name: split
dtype: string
- name: src
dtype: string
- name: style
dtype: string
splits:
- name: train
num_bytes: 911798702.0
num_examples: 40000
download_size: 910126597
dataset_size: 911798702.0
---
# Dataset Card for "kaggle_females_dim_128_40k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jilp00/YouToks-Mistral-7B-Dataset | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 4244677
num_examples: 4223
download_size: 1556245
dataset_size: 4244677
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
gsstein/0-percent-human-dataset-llama-og | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: summary
dtype: string
- name: text
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 87577890
num_examples: 15326
- name: test
num_bytes: 3113220
num_examples: 576
- name: validation
num_bytes: 3310694
num_examples: 576
download_size: 57427601
dataset_size: 94001804
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
liuyanchen1015/MULTI_VALUE_cola_their_they | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 191
num_examples: 3
- name: test
num_bytes: 665
num_examples: 9
- name: train
num_bytes: 4850
num_examples: 60
download_size: 8717
dataset_size: 5706
---
# Dataset Card for "MULTI_VALUE_cola_their_they"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-futin__guess-en-6f8c6a-2012266601 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- futin/guess
eval_info:
task: text_zero_shot_classification
model: facebook/opt-125m
metrics: []
dataset_name: futin/guess
dataset_config: en
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: facebook/opt-125m
* Dataset: futin/guess
* Config: en
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@futin](https://huggingface.co/futin) for evaluating this model. |
growth-cadet/simplejobpost-to-signalsjson3 | ---
dataset_info:
features:
- name: ats
dtype: string
- name: context
dtype: string
- name: sys5_obj
struct:
- name: focus_areas
list:
- name: description
dtype: string
- name: subject
dtype: string
- name: industries
list:
- name: description
dtype: string
- name: subject
dtype: string
- name: products_and_technologies
list:
- name: description
dtype: string
- name: subject
dtype: string
- name: eval_crit
struct:
- name: focus_areas
dtype: float64
- name: industries
dtype: float64
- name: products_and_technologies
dtype: float64
- name: eval_values
struct:
- name: focus_areas
sequence: int64
- name: industries
sequence: int64
- name: products_and_technologies
sequence: int64
- name: uuid
dtype: string
- name: text3
dtype: string
splits:
- name: train
num_bytes: 55626095
num_examples: 4138
download_size: 25684738
dataset_size: 55626095
---
# Dataset Card for "simplejobpost-to-signalsjson3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bigbio/progene |
---
language:
- en
bigbio_language:
- English
license: cc-by-4.0
multilinguality: monolingual
bigbio_license_shortname: CC_BY_4p0
pretty_name: ProGene
homepage: https://zenodo.org/record/3698568#.YlVHqdNBxeg
bigbio_pubmed: True
bigbio_public: True
bigbio_tasks:
- NAMED_ENTITY_RECOGNITION
---
# Dataset Card for ProGene
## Dataset Description
- **Homepage:** https://zenodo.org/record/3698568#.YlVHqdNBxeg
- **Pubmed:** True
- **Public:** True
- **Tasks:** NER
The Protein/Gene corpus was developed at the JULIE Lab Jena under supervision of Prof. Udo Hahn.
The executing scientist was Dr. Joachim Wermter.
The main annotator was Dr. Rico Pusch, who is an expert in biology.
The corpus was developed in the context of the StemNet project (http://www.stemnet.de/).
## Citation Information
```
@inproceedings{faessler-etal-2020-progene,
title = "{P}ro{G}ene - A Large-scale, High-Quality Protein-Gene Annotated Benchmark Corpus",
author = "Faessler, Erik and
Modersohn, Luise and
Lohr, Christina and
Hahn, Udo",
booktitle = "Proceedings of the 12th Language Resources and Evaluation Conference",
month = may,
year = "2020",
address = "Marseille, France",
publisher = "European Language Resources Association",
url = "https://aclanthology.org/2020.lrec-1.564",
pages = "4585--4596",
abstract = "Genes and proteins constitute the fundamental entities of molecular genetics. We here introduce ProGene (formerly called FSU-PRGE), a corpus that reflects our efforts to cope with this important class of named entities within the framework of a long-lasting large-scale annotation campaign at the Jena University Language {\&} Information Engineering (JULIE) Lab. We assembled the entire corpus from 11 subcorpora covering various biological domains to achieve an overall subdomain-independent corpus. It consists of 3,308 MEDLINE abstracts with over 36k sentences and more than 960k tokens annotated with nearly 60k named entity mentions. Two annotators strove for carefully assigning entity mentions to classes of genes/proteins as well as families/groups, complexes, variants and enumerations of those where genes and proteins are represented by a single class. The main purpose of the corpus is to provide a large body of consistent and reliable annotations for supervised training and evaluation of machine learning algorithms in this relevant domain. Furthermore, we provide an evaluation of two state-of-the-art baseline systems {---} BioBert and flair {---} on the ProGene corpus. We make the evaluation datasets and the trained models available to encourage comparable evaluations of new methods in the future.",
language = "English",
ISBN = "979-10-95546-34-4",
}
```
|
Nerfgun3/land_style | ---
language:
- en
tags:
- stable-diffusion
- text-to-image
license: creativeml-openrail-m
inference: false
---
# Landscape Style Embedding / Textual Inversion
## Usage
To use this embedding, download the file and drop it into the "\stable-diffusion-webui\embeddings" folder
Two different Versions:
### Version 1:
File: ```land_style```
To use it in a prompt: ```"art by land_style"```
For best results, write something like ```highly detailed background art by land_style```
### Version 2:
File: ```landscape_style```
To use it in a prompt: ```"art by landscape_style"```
For best results, write something like ```highly detailed background art by landscape_style```
If it is too strong, just add [] around it.
Trained for 7000 steps
Have fun :)
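The usage rules above can be sketched as a tiny prompt helper; the function name and template are illustrative only, not part of the embedding itself:

```python
def build_prompt(subject: str, embedding: str, weaken: bool = False) -> str:
    """Assemble a prompt that invokes a textual-inversion embedding.

    weaken=True wraps the token in [] -- stable-diffusion-webui's
    attention syntax for reducing an embedding's influence.
    """
    token = f"[{embedding}]" if weaken else embedding
    return f"{subject} art by {token}"
```

For instance, `build_prompt("highly detailed background", "land_style")` yields the recommended prompt, and passing `weaken=True` produces the bracketed variant for when the style is too strong.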
## Example Pictures
<img src=https://i.imgur.com/UjoXFkJ.png width=100% height=100%/>
<img src=https://i.imgur.com/rAoEyLK.png width=100% height=100%/>
<img src=https://i.imgur.com/SpPsc7i.png width=100% height=100%/>
<img src=https://i.imgur.com/zMH0EeI.png width=100% height=100%/>
<img src=https://i.imgur.com/iQe0Jxc.png width=100% height=100%/>
## License
This embedding is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage.
The CreativeML OpenRAIL License specifies:
1. You can't use the embedding to deliberately produce or share illegal or harmful outputs or content
2. The authors claim no rights over the outputs you generate; you are free to use them and are accountable for their use, which must not go against the provisions set in the license
3. You may redistribute the weights and use the embedding commercially and/or as a service. If you do, please be aware that you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M with all your users (please read the license entirely and carefully)
[Please read the full license here](https://huggingface.co/spaces/CompVis/stable-diffusion-license) |
vigneshgs7/Boundary_detection_Doc_3 | ---
dataset_info:
features:
- name: name
dtype: string
- name: uuid
dtype: string
- name: status
dtype: string
- name: image
dtype: image
- name: label.annotations
list:
- name: id
dtype: int32
- name: category_id
dtype: int32
- name: label.segmentation_bitmap
dtype: image
splits:
- name: train
num_bytes: 6567269624.0
num_examples: 132
download_size: 433493682
dataset_size: 6567269624.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_FelixChao__WizardDolphin-7B | ---
pretty_name: Evaluation run of FelixChao/WizardDolphin-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [FelixChao/WizardDolphin-7B](https://huggingface.co/FelixChao/WizardDolphin-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__WizardDolphin-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-13T19:47:12.026725](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__WizardDolphin-7B/blob/main/results_2024-01-13T19-47-12.026725.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6312916506690491,\n\
\ \"acc_stderr\": 0.0324258954325278,\n \"acc_norm\": 0.6317815176886508,\n\
\ \"acc_norm_stderr\": 0.03308456506657342,\n \"mc1\": 0.42105263157894735,\n\
\ \"mc1_stderr\": 0.017283936248136487,\n \"mc2\": 0.5927990044155668,\n\
\ \"mc2_stderr\": 0.01547758043423419\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6203071672354948,\n \"acc_stderr\": 0.014182119866974872,\n\
\ \"acc_norm\": 0.6467576791808873,\n \"acc_norm_stderr\": 0.013967822714840055\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6707827126070504,\n\
\ \"acc_stderr\": 0.004689685978155169,\n \"acc_norm\": 0.8585939055964947,\n\
\ \"acc_norm_stderr\": 0.003477278544493499\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939098,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939098\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119667,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119667\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\
\ \"acc_stderr\": 0.037242495958177295,\n \"acc_norm\": 0.6069364161849711,\n\
\ \"acc_norm_stderr\": 0.037242495958177295\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5191489361702127,\n \"acc_stderr\": 0.03266204299064678,\n\
\ \"acc_norm\": 0.5191489361702127,\n \"acc_norm_stderr\": 0.03266204299064678\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924003,\n \"\
acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924003\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7677419354838709,\n \"acc_stderr\": 0.024022256130308235,\n \"\
acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.024022256130308235\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"\
acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386417,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386417\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6256410256410256,\n \"acc_stderr\": 0.0245375915728305,\n \
\ \"acc_norm\": 0.6256410256410256,\n \"acc_norm_stderr\": 0.0245375915728305\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n\
\ \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.037101857261199946,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.037101857261199946\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8128440366972477,\n \"acc_stderr\": 0.01672268452620014,\n \"\
acc_norm\": 0.8128440366972477,\n \"acc_norm_stderr\": 0.01672268452620014\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474082,\n \"\
acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474082\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.02765215314415927,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.02765215314415927\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.03957835471980979,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.03957835471980979\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368985,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368985\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.024685316867257806,\n\
\ \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.024685316867257806\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3541899441340782,\n\
\ \"acc_stderr\": 0.01599564494729924,\n \"acc_norm\": 0.3541899441340782,\n\
\ \"acc_norm_stderr\": 0.01599564494729924\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.02625605383571896,\n\
\ \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.02625605383571896\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n\
\ \"acc_stderr\": 0.026311858071854155,\n \"acc_norm\": 0.6881028938906752,\n\
\ \"acc_norm_stderr\": 0.026311858071854155\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.025407197798890162,\n\
\ \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.025407197798890162\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4530638852672751,\n\
\ \"acc_stderr\": 0.012713845972358981,\n \"acc_norm\": 0.4530638852672751,\n\
\ \"acc_norm_stderr\": 0.012713845972358981\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.02928941340940319,\n\
\ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.02928941340940319\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6584967320261438,\n \"acc_stderr\": 0.019184639328092487,\n \
\ \"acc_norm\": 0.6584967320261438,\n \"acc_norm_stderr\": 0.019184639328092487\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304335,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304335\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n\
\ \"acc_stderr\": 0.027962677604768914,\n \"acc_norm\": 0.8059701492537313,\n\
\ \"acc_norm_stderr\": 0.027962677604768914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.02954774168764004,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.02954774168764004\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42105263157894735,\n\
\ \"mc1_stderr\": 0.017283936248136487,\n \"mc2\": 0.5927990044155668,\n\
\ \"mc2_stderr\": 0.01547758043423419\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7853196527229677,\n \"acc_stderr\": 0.011539912734345391\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6626231993934799,\n \
\ \"acc_stderr\": 0.013023665136222088\n }\n}\n```"
repo_url: https://huggingface.co/delayedkarma/NeuralHermes-2.5-Mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|arc:challenge|25_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|gsm8k|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hellaswag|10_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T19-47-12.026725.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T19-47-12.026725.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- '**/details_harness|winogrande|5_2024-01-13T19-47-12.026725.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-13T19-47-12.026725.parquet'
- config_name: results
data_files:
- split: 2024_01_13T19_47_12.026725
path:
- results_2024-01-13T19-47-12.026725.parquet
- split: latest
path:
- results_2024-01-13T19-47-12.026725.parquet
---
# Dataset Card for Evaluation run of FelixChao/WizardDolphin-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [FelixChao/WizardDolphin-7B](https://huggingface.co/FelixChao/WizardDolphin-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FelixChao__WizardDolphin-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-13T19:47:12.026725](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__WizardDolphin-7B/blob/main/results_2024-01-13T19-47-12.026725.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the "results" config and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6312916506690491,
"acc_stderr": 0.0324258954325278,
"acc_norm": 0.6317815176886508,
"acc_norm_stderr": 0.03308456506657342,
"mc1": 0.42105263157894735,
"mc1_stderr": 0.017283936248136487,
"mc2": 0.5927990044155668,
"mc2_stderr": 0.01547758043423419
},
"harness|arc:challenge|25": {
"acc": 0.6203071672354948,
"acc_stderr": 0.014182119866974872,
"acc_norm": 0.6467576791808873,
"acc_norm_stderr": 0.013967822714840055
},
"harness|hellaswag|10": {
"acc": 0.6707827126070504,
"acc_stderr": 0.004689685978155169,
"acc_norm": 0.8585939055964947,
"acc_norm_stderr": 0.003477278544493499
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119667,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119667
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.037242495958177295,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.037242495958177295
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5191489361702127,
"acc_stderr": 0.03266204299064678,
"acc_norm": 0.5191489361702127,
"acc_norm_stderr": 0.03266204299064678
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924003,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924003
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386417,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6256410256410256,
"acc_stderr": 0.0245375915728305,
"acc_norm": 0.6256410256410256,
"acc_norm_stderr": 0.0245375915728305
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.037101857261199946,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.037101857261199946
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8128440366972477,
"acc_stderr": 0.01672268452620014,
"acc_norm": 0.8128440366972477,
"acc_norm_stderr": 0.01672268452620014
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474082,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.02765215314415927,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.02765215314415927
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.031493846709941306,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.031493846709941306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.03957835471980979,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.03957835471980979
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179333,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368985,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368985
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.024685316867257806,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.024685316867257806
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3541899441340782,
"acc_stderr": 0.01599564494729924,
"acc_norm": 0.3541899441340782,
"acc_norm_stderr": 0.01599564494729924
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.02625605383571896,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.02625605383571896
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.026311858071854155,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.026311858071854155
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.025407197798890162,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.025407197798890162
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4530638852672751,
"acc_stderr": 0.012713845972358981,
"acc_norm": 0.4530638852672751,
"acc_norm_stderr": 0.012713845972358981
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.02928941340940319,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.02928941340940319
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6584967320261438,
"acc_stderr": 0.019184639328092487,
"acc_norm": 0.6584967320261438,
"acc_norm_stderr": 0.019184639328092487
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304335,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304335
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.027962677604768914,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.027962677604768914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.02954774168764004,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.02954774168764004
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42105263157894735,
"mc1_stderr": 0.017283936248136487,
"mc2": 0.5927990044155668,
"mc2_stderr": 0.01547758043423419
},
"harness|winogrande|5": {
"acc": 0.7853196527229677,
"acc_stderr": 0.011539912734345391
},
"harness|gsm8k|5": {
"acc": 0.6626231993934799,
"acc_stderr": 0.013023665136222088
}
}
```
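As a sketch of how the aggregate `acc` under `"all"` relates to the per-task numbers, the leaderboard-style aggregate is (roughly) the mean of the per-subtask accuracies. The task names and values below are copied from the results JSON above, but the average is computed over this small subset only, for illustration:

```python
# Mean over a few per-task accuracies taken from the results JSON above.
scores = {
    "hendrycksTest-abstract_algebra": 0.37,
    "hendrycksTest-anatomy": 0.6370370370370371,
    "hendrycksTest-astronomy": 0.6973684210526315,
}
mean_acc = sum(scores.values()) / len(scores)
print(round(mean_acc, 4))  # mean of this subset, not the full "all" aggregate
```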
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
FaalSa/cluster8 | ---
dataset_info:
features:
- name: start
dtype: timestamp[s]
- name: target
sequence: float32
- name: item_id
dtype: string
- name: feat_static_cat
sequence: uint64
splits:
- name: train
num_bytes: 3787392
num_examples: 96
- name: validation
num_bytes: 3833472
num_examples: 96
- name: test
num_bytes: 3879552
num_examples: 96
download_size: 3984584
dataset_size: 11500416
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
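For readers unfamiliar with this time-series schema, a single record in this layout looks roughly like the sketch below. The values are illustrative placeholders, not taken from the actual data:

```python
# Illustrative record matching the features declared in the YAML above.
record = {
    "start": "2020-01-01T00:00:00",  # timestamp[s]: start of the series
    "target": [1.0, 2.5, 3.2],       # float32 sequence: the observed values
    "item_id": "T1",                 # string identifier for the series
    "feat_static_cat": [8],          # uint64 sequence: static categorical features
}
print(sorted(record))
```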
|
kz919/open-orca-flan-50k-synthetic-reward-dpo-mistral-7b | ---
license: apache-2.0
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
dtype: string
- name: task
dtype: string
- name: ignos-Mistral-T5-7B-v1
dtype: string
- name: cognAI-lil-c3po
dtype: string
- name: viethq188-Rabbit-7B-DPO-Chat
dtype: string
- name: cookinai-DonutLM-v1
dtype: string
- name: v1olet-v1olet-merged-dpo-7B
dtype: string
- name: normalized_rewards
sequence: float32
- name: router_label
dtype: int64
splits:
- name: train
num_bytes: 105157970
num_examples: 50000
download_size: 48849500
dataset_size: 105157970
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ssbuild/alpaca_sharegpt | ---
license: agpl-3.0
---
|
gart-labor/eclassQuery | ---
dataset_info:
features:
- name: did
dtype: int64
- name: query
dtype: string
- name: name
dtype: string
- name: duplicate_id
dtype: int64
- name: metalabel
dtype: int64
splits:
- name: train
num_bytes: 147176
num_examples: 1040
- name: eval
num_bytes: 100846
num_examples: 671
download_size: 113268
dataset_size: 248022
task_categories:
- sentence-similarity
language:
- en
size_categories:
- 1K<n<10K
---
# Dataset Card for "eclassQuery"
This dataset consists of paraphrases of ECLASS-standard pump properties. It can be used to evaluate models on the task of matching these paraphrases to the actual ECLASS-standard pump properties based on their semantics. |
CyberHarem/de_ruyter_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of de_ruyter (Kantai Collection)
This is the dataset of de_ruyter (Kantai Collection), containing 191 images and their tags.
The core tags of this character are `long_hair, bangs, red_hair, blunt_bangs, braid, green_eyes, side_braid, hairband, black_hairband, curly_hair, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 191 | 171.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/de_ruyter_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 191 | 114.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/de_ruyter_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 441 | 246.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/de_ruyter_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 191 | 159.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/de_ruyter_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 441 | 318.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/de_ruyter_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/de_ruyter_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, solo, cowboy_shot, medium_breasts, navel, open_mouth, simple_background, frills, side-tie_bikini_bottom, smile, white_background, black_bikini, collarbone |
| 1 | 7 |  |  |  |  |  | 1girl, bell, black_thighhighs, green_sailor_collar, green_skirt, orange_neckerchief, pleated_skirt, sailor_shirt, serafuku, simple_background, solo, white_background, headgear, wavy_hair, open_mouth, chain, green_footwear |
| 2 | 12 |  |  |  |  |  | 1girl, green_sailor_collar, green_skirt, orange_neckerchief, pleated_skirt, sailor_shirt, serafuku, solo, bell, cowboy_shot, chain, looking_at_viewer, simple_background, smile, black_belt, headgear, open_mouth, white_background |
| 3 | 5 |  |  |  |  |  | 1girl, green_sailor_collar, open_mouth, orange_neckerchief, serafuku, simple_background, smile, solo, upper_body, bell, blush, sailor_shirt, short_sleeves, white_background, chain, headgear, looking_at_viewer, yellow_eyes |
| 4 | 21 |  |  |  |  |  | 1girl, solo, long_sleeves, blush, orange_scarf, smile, open_mouth, alternate_costume, coat, simple_background, white_background, looking_at_viewer, sweater, red_jacket, upper_body, black_skirt, black_thighhighs |
| 5 | 5 |  |  |  |  |  | 1girl, alternate_costume, detached_collar, fake_animal_ears, playboy_bunny, rabbit_ears, solo, strapless_leotard, wrist_cuffs, black_pantyhose, green_leotard, looking_at_viewer, simple_background, wavy_hair, cowboy_shot, one-hour_drawing_challenge, one_eye_closed, smile, white_background, ;d, \m/, bell, bowtie, green_footwear, medium_breasts, open_mouth, rabbit_tail, star_(symbol), twitter_username |
| 6 | 22 |  |  |  |  |  | 1girl, solo, bell, red_dress, black_gloves, capelet, christmas, fingerless_gloves, smile, black_belt, blush, hair_ornament, holly, medium_breasts, open_mouth, black_thighhighs, cleavage, corset, cowboy_shot, gift_box, looking_at_viewer, simple_background, white_background, one_eye_closed |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | looking_at_viewer | solo | cowboy_shot | medium_breasts | navel | open_mouth | simple_background | frills | side-tie_bikini_bottom | smile | white_background | black_bikini | collarbone | bell | black_thighhighs | green_sailor_collar | green_skirt | orange_neckerchief | pleated_skirt | sailor_shirt | serafuku | headgear | wavy_hair | chain | green_footwear | black_belt | upper_body | blush | short_sleeves | yellow_eyes | long_sleeves | orange_scarf | alternate_costume | coat | sweater | red_jacket | black_skirt | detached_collar | fake_animal_ears | playboy_bunny | rabbit_ears | strapless_leotard | wrist_cuffs | black_pantyhose | green_leotard | one-hour_drawing_challenge | one_eye_closed | ;d | \m/ | bowtie | rabbit_tail | star_(symbol) | twitter_username | red_dress | black_gloves | capelet | christmas | fingerless_gloves | hair_ornament | holly | corset | gift_box |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:--------------------|:-------|:--------------|:-----------------|:--------|:-------------|:--------------------|:---------|:-------------------------|:--------|:-------------------|:---------------|:-------------|:-------|:-------------------|:----------------------|:--------------|:---------------------|:----------------|:---------------|:-----------|:-----------|:------------|:--------|:-----------------|:-------------|:-------------|:--------|:----------------|:--------------|:---------------|:---------------|:--------------------|:-------|:----------|:-------------|:--------------|:------------------|:-------------------|:----------------|:--------------|:--------------------|:--------------|:------------------|:----------------|:-----------------------------|:-----------------|:-----|:------|:---------|:--------------|:----------------|:-------------------|:------------|:---------------|:----------|:------------|:--------------------|:----------------|:--------|:---------|:-----------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | | | X | | | | X | X | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 12 |  |  |  |  |  | X | | X | X | X | | | X | X | | | X | X | | | X | | X | X | X | X | X | X | X | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | X | X | | | | X | X | | | X | X | | | X | | X | | X | | X | X | X | | X | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 21 |  |  |  |  |  | X | | X | X | | | | X | X | | | X | X | | | | X | | | | | | | | | | | | X | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | X | X | X | X | | X | X | | | X | X | | | X | | | | | | | | | X | | X | | | | | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 6 | 22 |  |  |  |  |  | X | X | X | X | X | X | | X | X | | | X | X | | | X | X | | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X |
|
PT-10/blood_transfusion_uci | ---
license: other
---
|
keelezibel/sentence_classification_dataset | ---
license: apache-2.0
---
This dataset is automatically curated from three datasets.
- Wikipedia_AfD_imperative_data
- Spaadia
- SquadV2
- Samples from https://github.com/lettergram/sentence-classification/tree/master
Only 3 classes are available:
{"declarative": 0, "question": 1, "imperative": 2}
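A minimal sketch of using this mapping to encode class names as the integer ids above (the helper below is illustrative, not part of the dataset):

```python
# Map the three sentence-type classes to integer ids, as listed above.
LABEL2ID = {"declarative": 0, "question": 1, "imperative": 2}
ID2LABEL = {v: k for k, v in LABEL2ID.items()}

def encode(labels):
    """Convert class names to the integer ids used by this dataset."""
    return [LABEL2ID[name] for name in labels]

print(encode(["question", "imperative"]))  # [1, 2]
```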
Note: As this dataset is automatically curated, it may not be the cleanest. Use at your own risk. |
Farsight-AI/10k_bench | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: cik
dtype: int64
- name: context
dtype: string
- name: filingDate
dtype: timestamp[s]
- name: docID
dtype: string
- name: generatedQuestion
dtype: string
splits:
- name: train
num_bytes: 100350
num_examples: 130
download_size: 56969
dataset_size: 100350
---
# Dataset Card for "10k_bench"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KevinTao511/pets | ---
license: mit
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': abyssinian
'1': basset
'2': beagle
splits:
- name: train
num_bytes: 32892148
num_examples: 289
download_size: 32848292
dataset_size: 32892148
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- image-classification
language:
- en
tags:
- pets
pretty_name: tao-ai-pets
size_categories:
- n<1K
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/shigure_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of shigure/時雨/时雨 (Azur Lane)
This is the dataset of shigure/時雨/时雨 (Azur Lane), containing 162 images and their tags.
The core tags of this character are `animal_ears, black_hair, long_hair, red_eyes, breasts, wolf_ears, tail, ponytail, wolf_tail, medium_breasts, fang, bangs, animal_ear_fluff, wolf_girl`, which are pruned in this dataset.
Images were crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 162 | 173.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shigure_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 162 | 108.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shigure_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 405 | 236.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shigure_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 162 | 154.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shigure_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 405 | 316.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shigure_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/shigure_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
Tag clustering results are listed below; distinct outfits may be identifiable from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | blush, school_swimsuit, looking_at_viewer, 1girl, barefoot, solo, open_mouth, hair_ribbon, white_ribbon, bare_legs, blue_one-piece_swimsuit, smile, ass, soles, bare_shoulders, feet, nail_polish, bare_arms, blue_nails, simple_background, white_background |
| 1 | 25 |  |  |  |  |  | 1girl, pleated_skirt, looking_at_viewer, midriff, solo, cleavage, white_skirt, nail_polish, navel, open_mouth, claw_pose, red_nails, blush, serafuku, wristband, collarbone, crop_top, simple_background, white_background, :d, off_shoulder, short_sleeves, bare_shoulders, black_shirt, loose_socks, miniskirt |
| 2 | 5 |  |  |  |  |  | 1girl, black_garter_straps, dress, looking_at_viewer, official_alternate_costume, paw_gloves, solo, apron, black_footwear, black_thighhighs, blush, crossed_legs, full_body, high_heels, open_mouth, sitting, bell, gift, heart_balloon, rose, valentine, bow, brown_thighhighs, heart-shaped_box, simple_background, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | blush | school_swimsuit | looking_at_viewer | 1girl | barefoot | solo | open_mouth | hair_ribbon | white_ribbon | bare_legs | blue_one-piece_swimsuit | smile | ass | soles | bare_shoulders | feet | nail_polish | bare_arms | blue_nails | simple_background | white_background | pleated_skirt | midriff | cleavage | white_skirt | navel | claw_pose | red_nails | serafuku | wristband | collarbone | crop_top | :d | off_shoulder | short_sleeves | black_shirt | loose_socks | miniskirt | black_garter_straps | dress | official_alternate_costume | paw_gloves | apron | black_footwear | black_thighhighs | crossed_legs | full_body | high_heels | sitting | bell | gift | heart_balloon | rose | valentine | bow | brown_thighhighs | heart-shaped_box |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------------|:--------------------|:--------|:-----------|:-------|:-------------|:--------------|:---------------|:------------|:--------------------------|:--------|:------|:--------|:-----------------|:-------|:--------------|:------------|:-------------|:--------------------|:-------------------|:----------------|:----------|:-----------|:--------------|:--------|:------------|:------------|:-----------|:------------|:-------------|:-----------|:-----|:---------------|:----------------|:--------------|:--------------|:------------|:----------------------|:--------|:-----------------------------|:-------------|:--------|:-----------------|:-------------------|:---------------|:------------|:-------------|:----------|:-------|:-------|:----------------|:-------|:------------|:------|:-------------------|:-------------------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 25 |  |  |  |  |  | X | | X | X | | X | X | | | | | | | | X | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | | X | X | | X | X | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
khalidalt/tydiqa-goldp | ---
pretty_name: TyDi QA
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- en
- ar
- bn
- fi
- id
- ja
- sw
- ko
- ru
- te
- th
license:
- apache-2.0
multilinguality:
- multilingual
size_categories:
- unknown
source_datasets:
- extended|wikipedia
task_categories:
- question-answering
task_ids:
- extractive-qa
paperswithcode_id: tydi-qa
---
# Dataset Card for "tydiqa"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://github.com/google-research-datasets/tydiqa](https://github.com/google-research-datasets/tydiqa)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 3726.74 MB
- **Size of the generated dataset:** 5812.92 MB
- **Total amount of disk used:** 9539.67 MB
### Dataset Summary
TyDi QA is a question answering dataset covering 11 typologically diverse languages with 204K question-answer pairs.
The languages of TyDi QA are diverse with regard to their typology -- the set of linguistic features that each language
expresses -- such that we expect models performing well on this set to generalize across a large number of the languages
in the world. It contains language phenomena that would not be found in English-only corpora. To provide a realistic
information-seeking task and avoid priming effects, questions are written by people who want to know the answer, but
don’t know the answer yet (unlike SQuAD and its descendants), and the data is collected directly in each language without
the use of translation (unlike MLQA and XQuAD).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### primary_task
- **Size of downloaded dataset files:** 1863.37 MB
- **Size of the generated dataset:** 5757.59 MB
- **Total amount of disk used:** 7620.96 MB
An example of 'validation' looks as follows.
```
This example was too long and was cropped:
{
"annotations": {
"minimal_answers_end_byte": [-1, -1, -1],
"minimal_answers_start_byte": [-1, -1, -1],
"passage_answer_candidate_index": [-1, -1, -1],
"yes_no_answer": ["NONE", "NONE", "NONE"]
},
"document_plaintext": "\"\\nรองศาสตราจารย์[1] หม่อมราชวงศ์สุขุมพันธุ์ บริพัตร (22 กันยายน 2495 -) ผู้ว่าราชการกรุงเทพมหานครคนที่ 15 อดีตรองหัวหน้าพรรคปร...",
"document_title": "หม่อมราชวงศ์สุขุมพันธุ์ บริพัตร",
"document_url": "\"https://th.wikipedia.org/wiki/%E0%B8%AB%E0%B8%A1%E0%B9%88%E0%B8%AD%E0%B8%A1%E0%B8%A3%E0%B8%B2%E0%B8%8A%E0%B8%A7%E0%B8%87%E0%B8%...",
"language": "thai",
"passage_answer_candidates": "{\"plaintext_end_byte\": [494, 1779, 2931, 3904, 4506, 5588, 6383, 7122, 8224, 9375, 10473, 12563, 15134, 17765, 19863, 21902, 229...",
"question_text": "\"หม่อมราชวงศ์สุขุมพันธุ์ บริพัตร เรียนจบจากที่ไหน ?\"..."
}
```
#### secondary_task
- **Size of downloaded dataset files:** 1863.37 MB
- **Size of the generated dataset:** 55.34 MB
- **Total amount of disk used:** 1918.71 MB
An example of 'validation' looks as follows.
```
This example was too long and was cropped:
{
"answers": {
"answer_start": [394],
"text": ["بطولتين"]
},
"context": "\"أقيمت البطولة 21 مرة، شارك في النهائيات 78 دولة، وعدد الفرق التي فازت بالبطولة حتى الآن 8 فرق، ويعد المنتخب البرازيلي الأكثر تت...",
"id": "arabic-2387335860751143628-1",
"question": "\"كم عدد مرات فوز الأوروغواي ببطولة كاس العالم لكرو القدم؟\"...",
"title": "قائمة نهائيات كأس العالم"
}
```
### Data Fields
The data fields are the same among all splits.
#### primary_task
- `passage_answer_candidates`: a dictionary feature containing:
- `plaintext_start_byte`: a `int32` feature.
- `plaintext_end_byte`: a `int32` feature.
- `question_text`: a `string` feature.
- `document_title`: a `string` feature.
- `language`: a `string` feature.
- `annotations`: a dictionary feature containing:
- `passage_answer_candidate_index`: a `int32` feature.
- `minimal_answers_start_byte`: a `int32` feature.
- `minimal_answers_end_byte`: a `int32` feature.
- `yes_no_answer`: a `string` feature.
- `document_plaintext`: a `string` feature.
- `document_url`: a `string` feature.
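The `minimal_answers_start_byte` and `minimal_answers_end_byte` fields are offsets into the UTF-8 byte encoding of `document_plaintext` (with `-1` meaning no minimal answer). A small sketch of slicing out an answer span, using illustrative values rather than a real example:

```python
# Byte offsets index into the UTF-8 encoding, not the Python string,
# which matters for non-ASCII documents (e.g. Thai or Arabic).
document_plaintext = "TyDi QA covers 11 typologically diverse languages."
minimal_answers_start_byte = 15  # illustrative values only
minimal_answers_end_byte = 17

raw = document_plaintext.encode("utf-8")
answer = raw[minimal_answers_start_byte:minimal_answers_end_byte].decode("utf-8")
print(answer)  # 11
```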
#### secondary_task
- `id`: a `string` feature.
- `title`: a `string` feature.
- `context`: a `string` feature.
- `question`: a `string` feature.
- `answers`: a dictionary feature containing:
- `text`: a `string` feature.
- `answer_start`: a `int32` feature.
### Data Splits
| name | train | validation |
| -------------- | -----: | ---------: |
| primary_task | 166916 | 18670 |
| secondary_task | 49881 | 5077 |
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@article{tydiqa,
title = {TyDi QA: A Benchmark for Information-Seeking Question Answering in Typologically Diverse Languages},
author = {Jonathan H. Clark and Eunsol Choi and Michael Collins and Dan Garrette and Tom Kwiatkowski and Vitaly Nikolaev and Jennimaria Palomaki},
year = {2020},
journal = {Transactions of the Association for Computational Linguistics}
}
```
```
@inproceedings{ruder-etal-2021-xtreme,
title = "{XTREME}-{R}: Towards More Challenging and Nuanced Multilingual Evaluation",
author = "Ruder, Sebastian and
Constant, Noah and
Botha, Jan and
Siddhant, Aditya and
Firat, Orhan and
Fu, Jinlan and
Liu, Pengfei and
Hu, Junjie and
Garrette, Dan and
Neubig, Graham and
Johnson, Melvin",
booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing",
month = nov,
year = "2021",
address = "Online and Punta Cana, Dominican Republic",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.emnlp-main.802",
doi = "10.18653/v1/2021.emnlp-main.802",
pages = "10215--10245",
}
```
|
krvhrv/Healix-2.8B-Token-Medical-Shot | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 18067633753
num_examples: 1388866
download_size: 8989892446
dataset_size: 18067633753
---
# Dataset Card for "Healix-2.8B-Token-Medical-Shot"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anasmkh/guanaco-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966692
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yyu/reddit-attrprompt | ---
license: apache-2.0
task_categories:
- text-classification
language:
- en
tags:
- social_media
size_categories:
- 10K<n<100K
---
This is the data used in the paper [Large Language Model as Attributed Training Data Generator: A Tale of Diversity and Bias](https://github.com/yueyu1030/AttrPrompt).
Check out the paper https://arxiv.org/abs/2306.15895 for details.
- `label.txt`: the label name for each class
- `train.jsonl`: The original training set.
- `valid.jsonl`: The original validation set.
- `test.jsonl`: The original test set.
- `simprompt.jsonl`: The training data generated by the simple prompt.
- `attrprompt.jsonl`: The training data generated by the attributed prompt.
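All of the `.jsonl` files above are JSON Lines (one JSON object per line). A minimal sketch of parsing that format, using an in-memory stand-in for `train.jsonl` with illustrative field names (the real field names may differ):

```python
import io
import json

# Stand-in for train.jsonl; the actual field names may differ.
payload = io.StringIO(
    '{"text": "first post", "label": 0}\n'
    '{"text": "second post", "label": 3}\n'
)

# One JSON object per non-empty line.
records = [json.loads(line) for line in payload if line.strip()]
print(len(records))  # 2
```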
Please cite the original paper if you use this dataset for your study. Thanks!
```
@article{geigle:2021:arxiv,
author = {Gregor Geigle and
Nils Reimers and
Andreas R{\"u}ckl{\'e} and
Iryna Gurevych},
title = {TWEAC: Transformer with Extendable QA Agent Classifiers},
journal = {arXiv preprint},
volume = {abs/2104.07081},
year = {2021},
url = {http://arxiv.org/abs/2104.07081},
archivePrefix = {arXiv},
eprint = {2104.07081}
}
@article{yu2023large,
title={Large Language Model as Attributed Training Data Generator: A Tale of Diversity and Bias},
author={Yu, Yue and Zhuang, Yuchen and Zhang, Jieyu and Meng, Yu and Ratner, Alexander and Krishna, Ranjay and Shen, Jiaming and Zhang, Chao},
journal={arXiv preprint arXiv:2306.15895},
year={2023}
}
``` |
djscrave/tsh | ---
configs:
- config_name: default
data_files:
- split: train
path: "train.csv"
- split: validation
path: "validation.csv"
- split: test
path: "test.csv"
license: openrail
task_categories:
- text-classification
language:
- fr
tags:
- chemistry
size_categories:
- 1K<n<10K
--- |
Pixelatory/ZINC20-Druglike | ---
tags:
- chemistry
size_categories:
- 1B<n<10B
configs:
- config_name: default
data_files: "zinc-druglike-cano.tar.xz"
---
Downloaded on November 25th, 2023 from ZINC-20 (https://zinc20.docking.org/).
Contains the unique SMILES molecules, canonicalized with RDKit.
The following filtering criteria were applied to the ZINC-20 tranches:
- 2D representation
- Standard reaction
- Annotated purchasability
- Drug-like subset
The distribution of tokenized SMILES sequence lengths is shown below, along with the top 15 SMILES tokens and their frequencies. The following regex was used
to split each SMILES string into tokens: `(\[[^\]]+]|Br?|Cl?|N|O|S|P|F|I|b|c|n|o|s|p|\(|\)|\.|=|#|-|\+|\\|\/|:|~|@|\?|>>?|\*|\$|\%[0-9]{2}|[0-9])`
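As a sanity check, this regex can be applied directly with Python's `re` module; a minimal sketch:

```python
import re

# The SMILES tokenization regex quoted above.
SMILES_REGEX = re.compile(
    r"(\[[^\]]+]|Br?|Cl?|N|O|S|P|F|I|b|c|n|o|s|p|\(|\)|\.|=|#|-|\+"
    r"|\\|\/|:|~|@|\?|>>?|\*|\$|\%[0-9]{2}|[0-9])"
)

def tokenize(smiles: str) -> list:
    # findall returns every match in order, i.e. the token stream.
    return SMILES_REGEX.findall(smiles)

tokens = tokenize("CC(=O)Oc1ccccc1C(=O)O")  # aspirin
print(tokens[:6])  # ['C', 'C', '(', '=', 'O', ')']
```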

 |
definite_pronoun_resolution | ---
annotations_creators:
- expert-generated
language_creators:
- crowdsourced
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- token-classification
task_ids:
- word-sense-disambiguation
paperswithcode_id: definite-pronoun-resolution-dataset
pretty_name: Definite Pronoun Resolution Dataset
dataset_info:
features:
- name: sentence
dtype: string
- name: pronoun
dtype: string
- name: candidates
sequence: string
length: 2
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
config_name: plain_text
splits:
- name: test
num_bytes: 71691
num_examples: 564
- name: train
num_bytes: 171511
num_examples: 1322
download_size: 227452
dataset_size: 243202
---
# Dataset Card for "definite_pronoun_resolution"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://www.hlt.utdallas.edu/~vince/data/emnlp12/](https://www.hlt.utdallas.edu/~vince/data/emnlp12/)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 0.23 MB
- **Size of the generated dataset:** 0.24 MB
- **Total amount of disk used:** 0.47 MB
### Dataset Summary
The examples were composed by 30 students from one of the author's undergraduate classes. These
sentence pairs cover topics ranging from real events (e.g., Iran's plan to
attack the Saudi ambassador to the U.S.) to events/characters in movies (e.g.,
Batman) and purely imaginary situations, largely reflecting the pop culture as
perceived by the American kids born in the early 90s. Each annotated example
spans four lines: the first line contains the sentence, the second line contains
the target pronoun, the third line contains the two candidate antecedents, and
the fourth line contains the correct antecedent. If the target pronoun appears
more than once in the sentence, its first occurrence is the one to be resolved.
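The four-line layout described above can be parsed with a few lines of Python; this is a sketch under the assumption that the two candidates on the third line are comma-separated:

```python
def parse_example(block: str) -> dict:
    """Parse one four-line raw example into a dict.

    Line 1: sentence, line 2: target pronoun,
    line 3: the two candidate antecedents, line 4: the correct antecedent.
    """
    sentence, pronoun, candidates, answer = block.strip().split("\n")
    return {
        "sentence": sentence.strip(),
        "pronoun": pronoun.strip(),
        "candidates": [c.strip() for c in candidates.split(",")],
        "answer": answer.strip(),
    }

raw = (
    "The bee landed on the flower because it had pollen.\n"
    "it\n"
    "the bee, the flower\n"
    "the flower"
)
example = parse_example(raw)
print(example["answer"])  # the flower
```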
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### plain_text
- **Size of downloaded dataset files:** 0.23 MB
- **Size of the generated dataset:** 0.24 MB
- **Total amount of disk used:** 0.47 MB
An example of 'train' looks as follows.
```
{
"candidates": ["coreference resolution", "chunking"],
"label": 0,
"pronoun": "it",
"sentence": "There is currently more work on coreference resolution than on chunking because it is a problem that is still far from being solved."
}
```
### Data Fields
The data fields are the same among all splits.
#### plain_text
- `sentence`: a `string` feature.
- `pronoun`: a `string` feature.
- `candidates`: a `list` of `string` features.
- `label`: a classification label, with possible values including `0` (0), `1` (1).
### Data Splits
| name |train|test|
|----------|----:|---:|
|plain_text| 1322| 564|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@inproceedings{rahman2012resolving,
title={Resolving complex cases of definite pronouns: the winograd schema challenge},
author={Rahman, Altaf and Ng, Vincent},
booktitle={Proceedings of the 2012 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning},
pages={777--789},
year={2012},
organization={Association for Computational Linguistics}
}
```
### Contributions
Thanks to [@thomwolf](https://github.com/thomwolf), [@lewtun](https://github.com/lewtun), [@patrickvonplaten](https://github.com/patrickvonplaten) for adding this dataset. |
TREC-AToMiC/AToMiC-Texts-v0.2.1 | ---
dataset_info:
features:
- name: text_id
dtype: string
- name: page_url
dtype: string
- name: page_title
dtype: string
- name: section_title
dtype: string
- name: context_page_description
dtype: string
- name: context_section_description
dtype: string
- name: media
sequence: string
- name: hierachy
sequence: string
- name: category
sequence: string
- name: source_id
dtype: string
splits:
- name: train
num_bytes: 20393084595
num_examples: 10134744
download_size: 7192298025
dataset_size: 20393084595
---
# Dataset Card for "AToMiC-Texts-v0.2.updated"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/movie_poster_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 35799
num_examples: 1000
download_size: 1361
dataset_size: 35799
---
# Dataset Card for "movie_poster_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shuyuej/temporary_consistency_data | ---
license: apache-2.0
--- |
open-llm-leaderboard/details_ArianAskari__NeuralHermes-2.5-Mistral-7B | ---
pretty_name: Evaluation run of ArianAskari/NeuralHermes-2.5-Mistral-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ArianAskari/NeuralHermes-2.5-Mistral-7B](https://huggingface.co/ArianAskari/NeuralHermes-2.5-Mistral-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ArianAskari__NeuralHermes-2.5-Mistral-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T00:23:23.624416](https://huggingface.co/datasets/open-llm-leaderboard/details_ArianAskari__NeuralHermes-2.5-Mistral-7B/blob/main/results_2024-02-02T00-23-23.624416.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6381118013498635,\n\
\ \"acc_stderr\": 0.03228022346468125,\n \"acc_norm\": 0.6407857748626177,\n\
\ \"acc_norm_stderr\": 0.032921741406603505,\n \"mc1\": 0.3598531211750306,\n\
\ \"mc1_stderr\": 0.016801860466677157,\n \"mc2\": 0.5223229691481044,\n\
\ \"mc2_stderr\": 0.015242725441292206\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6075085324232082,\n \"acc_stderr\": 0.014269634635670733,\n\
\ \"acc_norm\": 0.6467576791808873,\n \"acc_norm_stderr\": 0.013967822714840055\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6520613423620792,\n\
\ \"acc_stderr\": 0.00475342980664544,\n \"acc_norm\": 0.842760406293567,\n\
\ \"acc_norm_stderr\": 0.003632825479128597\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4312169312169312,\n \"acc_stderr\": 0.025506481698138215,\n \"\
acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.025506481698138215\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n\
\ \"acc_stderr\": 0.023157879349083525,\n \"acc_norm\": 0.7903225806451613,\n\
\ \"acc_norm_stderr\": 0.023157879349083525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.031584153240477114,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.031584153240477114\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.02833560973246336,\n \"acc_norm\"\
: 0.803030303030303,\n \"acc_norm_stderr\": 0.02833560973246336\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6076923076923076,\n \"acc_stderr\": 0.024756000382130956,\n\
\ \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.024756000382130956\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.02784081149587193,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.02784081149587193\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886797,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886797\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.0386155754625517,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.0386155754625517\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530343,\n \"\
acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530343\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.034063153607115086,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.034063153607115086\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.02812597226565437,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.02812597226565437\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n\
\ \"acc_stderr\": 0.030500283176545847,\n \"acc_norm\": 0.7085201793721974,\n\
\ \"acc_norm_stderr\": 0.030500283176545847\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.039578354719809805,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.039578354719809805\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.02250903393707781,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.02250903393707781\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993459,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993459\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.02418242749657761,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.02418242749657761\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3094972067039106,\n\
\ \"acc_stderr\": 0.015461169002371542,\n \"acc_norm\": 0.3094972067039106,\n\
\ \"acc_norm_stderr\": 0.015461169002371542\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.024630048979824782,\n\
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.024630048979824782\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n\
\ \"acc_stderr\": 0.026385273703464485,\n \"acc_norm\": 0.684887459807074,\n\
\ \"acc_norm_stderr\": 0.026385273703464485\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n\
\ \"acc_stderr\": 0.012748238397365549,\n \"acc_norm\": 0.4706649282920469,\n\
\ \"acc_norm_stderr\": 0.012748238397365549\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462937,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462937\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806304,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806304\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.027403859410786845,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.027403859410786845\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3598531211750306,\n\
\ \"mc1_stderr\": 0.016801860466677157,\n \"mc2\": 0.5223229691481044,\n\
\ \"mc2_stderr\": 0.015242725441292206\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7797947908445146,\n \"acc_stderr\": 0.011646276755089688\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5686125852918877,\n \
\ \"acc_stderr\": 0.013642195352511564\n }\n}\n```"
repo_url: https://huggingface.co/ArianAskari/NeuralHermes-2.5-Mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|arc:challenge|25_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|gsm8k|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hellaswag|10_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T00-23-23.624416.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T00-23-23.624416.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- '**/details_harness|winogrande|5_2024-02-02T00-23-23.624416.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T00-23-23.624416.parquet'
- config_name: results
data_files:
- split: 2024_02_02T00_23_23.624416
path:
- results_2024-02-02T00-23-23.624416.parquet
- split: latest
path:
- results_2024-02-02T00-23-23.624416.parquet
---
# Dataset Card for Evaluation run of ArianAskari/NeuralHermes-2.5-Mistral-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ArianAskari/NeuralHermes-2.5-Mistral-7B](https://huggingface.co/ArianAskari/NeuralHermes-2.5-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ArianAskari__NeuralHermes-2.5-Mistral-7B",
"harness_winogrande_5",
split="train")
```
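Each per-task configuration name maps to a parquet glob in a uniform way, as the `data_files` entries above show. A minimal sketch of that naming scheme (the glob pattern is copied from the metadata above; the helper function name is ours, not part of the leaderboard tooling):

```python
def hendrycks_config_glob(task: str, shots: int, timestamp: str) -> str:
    """Build the parquet glob used by a harness_hendrycksTest_* config.

    Mirrors the data_files patterns in the YAML header, e.g.
    '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T00-23-23.624416.parquet'
    """
    return f"**/details_harness|hendrycksTest-{task}|{shots}_{timestamp}.parquet"

# The 'latest' split of harness_hendrycksTest_econometrics_5 resolves to:
print(hendrycks_config_glob("econometrics", 5, "2024-02-02T00-23-23.624416"))
```

This is only illustrative; in practice you should load configurations by name via `load_dataset`, as in the example above, rather than globbing files yourself.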
## Latest results
These are the [latest results from run 2024-02-02T00:23:23.624416](https://huggingface.co/datasets/open-llm-leaderboard/details_ArianAskari__NeuralHermes-2.5-Mistral-7B/blob/main/results_2024-02-02T00-23-23.624416.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6381118013498635,
"acc_stderr": 0.03228022346468125,
"acc_norm": 0.6407857748626177,
"acc_norm_stderr": 0.032921741406603505,
"mc1": 0.3598531211750306,
"mc1_stderr": 0.016801860466677157,
"mc2": 0.5223229691481044,
"mc2_stderr": 0.015242725441292206
},
"harness|arc:challenge|25": {
"acc": 0.6075085324232082,
"acc_stderr": 0.014269634635670733,
"acc_norm": 0.6467576791808873,
"acc_norm_stderr": 0.013967822714840055
},
"harness|hellaswag|10": {
"acc": 0.6520613423620792,
"acc_stderr": 0.00475342980664544,
"acc_norm": 0.842760406293567,
"acc_norm_stderr": 0.003632825479128597
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.025506481698138215,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.025506481698138215
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.031584153240477114,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.031584153240477114
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.02833560973246336,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.02833560973246336
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6076923076923076,
"acc_stderr": 0.024756000382130956,
"acc_norm": 0.6076923076923076,
"acc_norm_stderr": 0.024756000382130956
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.02784081149587193,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.02784081149587193
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886797,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886797
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.0386155754625517,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.0386155754625517
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530343,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530343
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.034063153607115086,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.034063153607115086
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.02812597226565437,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.02812597226565437
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.030500283176545847,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.030500283176545847
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.039578354719809805,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.039578354719809805
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.02250903393707781,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.02250903393707781
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993459,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993459
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.02418242749657761,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.02418242749657761
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3094972067039106,
"acc_stderr": 0.015461169002371542,
"acc_norm": 0.3094972067039106,
"acc_norm_stderr": 0.015461169002371542
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.024630048979824782,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.024630048979824782
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464485,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464485
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.02399350170904211,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.02399350170904211
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4706649282920469,
"acc_stderr": 0.012748238397365549,
"acc_norm": 0.4706649282920469,
"acc_norm_stderr": 0.012748238397365549
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462937,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462937
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806304,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806304
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142773,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786845,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786845
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3598531211750306,
"mc1_stderr": 0.016801860466677157,
"mc2": 0.5223229691481044,
"mc2_stderr": 0.015242725441292206
},
"harness|winogrande|5": {
"acc": 0.7797947908445146,
"acc_stderr": 0.011646276755089688
},
"harness|gsm8k|5": {
"acc": 0.5686125852918877,
"acc_stderr": 0.013642195352511564
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
MAdAiLab/amazon-attrprompt | ---
dataset_info:
features:
- name: label
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 8106318.4
num_examples: 12144
- name: validation
num_bytes: 1013289.8
num_examples: 1518
- name: test
num_bytes: 1013289.8
num_examples: 1518
download_size: 6571131
dataset_size: 10132898.000000002
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_alnrg2arg__test | ---
pretty_name: Evaluation run of alnrg2arg/test
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [alnrg2arg/test](https://huggingface.co/alnrg2arg/test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_alnrg2arg__test\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-13T17:13:28.432807](https://huggingface.co/datasets/open-llm-leaderboard/details_alnrg2arg__test/blob/main/results_2024-01-13T17-13-28.432807.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23383463507765326,\n\
\ \"acc_stderr\": 0.03001653266955312,\n \"acc_norm\": 0.2331114809676361,\n\
\ \"acc_norm_stderr\": 0.030796413035811515,\n \"mc1\": 1.0,\n \
\ \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n\
\ },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22866894197952217,\n\
\ \"acc_stderr\": 0.012272853582540788,\n \"acc_norm\": 0.23037542662116042,\n\
\ \"acc_norm_stderr\": 0.01230492841874761\n },\n \"harness|hellaswag|10\"\
: {\n \"acc\": 0.2529376618203545,\n \"acc_stderr\": 0.004338071318912311,\n\
\ \"acc_norm\": 0.2523401712806214,\n \"acc_norm_stderr\": 0.004334676952703861\n\
\ },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\"\
: {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n\
\ \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n\
\ },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n\
\ \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n\
\ \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n\
\ \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n\
\ \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\"\
: {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n\
\ \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n\
\ },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\":\
\ 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n\
\ \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n\
\ \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n\
\ \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2425531914893617,\n\
\ \"acc_stderr\": 0.028020226271200217,\n \"acc_norm\": 0.2425531914893617,\n\
\ \"acc_norm_stderr\": 0.028020226271200217\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n\
\ \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"\
acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.23544973544973544,\n \"acc_stderr\": 0.021851509822031708,\n \"\
acc_norm\": 0.23544973544973544,\n \"acc_norm_stderr\": 0.021851509822031708\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.18226600985221675,\n \"acc_stderr\": 0.02716334085964515,\n \"\
acc_norm\": 0.18226600985221675,\n \"acc_norm_stderr\": 0.02716334085964515\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\"\
: 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.23737373737373738,\n \"acc_stderr\": 0.030313710538198906,\n \"\
acc_norm\": 0.23737373737373738,\n \"acc_norm_stderr\": 0.030313710538198906\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.23333333333333334,\n \"acc_stderr\": 0.021444547301560486,\n\
\ \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.021444547301560486\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.17880794701986755,\n \"acc_stderr\": 0.031287448506007245,\n \"\
acc_norm\": 0.17880794701986755,\n \"acc_norm_stderr\": 0.031287448506007245\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3183856502242152,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.3183856502242152,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.03915345408847836,\n\
\ \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.03915345408847836\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.042466243366976256,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.042466243366976256\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1553398058252427,\n \"acc_stderr\": 0.03586594738573974,\n\
\ \"acc_norm\": 0.1553398058252427,\n \"acc_norm_stderr\": 0.03586594738573974\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2503192848020434,\n\
\ \"acc_stderr\": 0.015491088951494586,\n \"acc_norm\": 0.2503192848020434,\n\
\ \"acc_norm_stderr\": 0.015491088951494586\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.25722543352601157,\n \"acc_stderr\": 0.02353292543104428,\n\
\ \"acc_norm\": 0.25722543352601157,\n \"acc_norm_stderr\": 0.02353292543104428\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24758842443729903,\n\
\ \"acc_stderr\": 0.024513879973621967,\n \"acc_norm\": 0.24758842443729903,\n\
\ \"acc_norm_stderr\": 0.024513879973621967\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24822695035460993,\n \"acc_stderr\": 0.025770015644290392,\n \
\ \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.025770015644290392\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2392438070404172,\n\
\ \"acc_stderr\": 0.010896123652676653,\n \"acc_norm\": 0.2392438070404172,\n\
\ \"acc_norm_stderr\": 0.010896123652676653\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.20915032679738563,\n \"acc_stderr\": 0.016453399332279326,\n \
\ \"acc_norm\": 0.20915032679738563,\n \"acc_norm_stderr\": 0.016453399332279326\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n\
\ \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n\
\ \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n\
\ \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n\
\ \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.24875621890547264,\n\
\ \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n\
\ \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n\
\ \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.03446296217088426,\n\
\ \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.03446296217088426\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 1.0,\n \"mc1_stderr\"\
: 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5114443567482242,\n \"acc_stderr\": 0.014048804199859325\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/alnrg2arg/test
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|arc:challenge|25_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|gsm8k|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hellaswag|10_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T17-13-28.432807.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T17-13-28.432807.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- '**/details_harness|winogrande|5_2024-01-13T17-13-28.432807.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-13T17-13-28.432807.parquet'
- config_name: results
data_files:
- split: 2024_01_13T17_13_28.432807
path:
- results_2024-01-13T17-13-28.432807.parquet
- split: latest
path:
- results_2024-01-13T17-13-28.432807.parquet
---
# Dataset Card for Evaluation run of alnrg2arg/test
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [alnrg2arg/test](https://huggingface.co/alnrg2arg/test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_alnrg2arg__test",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-13T17:13:28.432807](https://huggingface.co/datasets/open-llm-leaderboard/details_alnrg2arg__test/blob/main/results_2024-01-13T17-13-28.432807.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each task in the "results" and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.23383463507765326,
"acc_stderr": 0.03001653266955312,
"acc_norm": 0.2331114809676361,
"acc_norm_stderr": 0.030796413035811515,
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.22866894197952217,
"acc_stderr": 0.012272853582540788,
"acc_norm": 0.23037542662116042,
"acc_norm_stderr": 0.01230492841874761
},
"harness|hellaswag|10": {
"acc": 0.2529376618203545,
"acc_stderr": 0.004338071318912311,
"acc_norm": 0.2523401712806214,
"acc_norm_stderr": 0.004334676952703861
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2425531914893617,
"acc_stderr": 0.028020226271200217,
"acc_norm": 0.2425531914893617,
"acc_norm_stderr": 0.028020226271200217
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23544973544973544,
"acc_stderr": 0.021851509822031708,
"acc_norm": 0.23544973544973544,
"acc_norm_stderr": 0.021851509822031708
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.18226600985221675,
"acc_stderr": 0.02716334085964515,
"acc_norm": 0.18226600985221675,
"acc_norm_stderr": 0.02716334085964515
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.23737373737373738,
"acc_stderr": 0.030313710538198906,
"acc_norm": 0.23737373737373738,
"acc_norm_stderr": 0.030313710538198906
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23333333333333334,
"acc_stderr": 0.021444547301560486,
"acc_norm": 0.23333333333333334,
"acc_norm_stderr": 0.021444547301560486
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.17880794701986755,
"acc_stderr": 0.031287448506007245,
"acc_norm": 0.17880794701986755,
"acc_norm_stderr": 0.031287448506007245
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3183856502242152,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.3183856502242152,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.03915345408847836,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.03915345408847836
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976256,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.042466243366976256
},
"harness|hendrycksTest-management|5": {
"acc": 0.1553398058252427,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.1553398058252427,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2503192848020434,
"acc_stderr": 0.015491088951494586,
"acc_norm": 0.2503192848020434,
"acc_norm_stderr": 0.015491088951494586
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.25722543352601157,
"acc_stderr": 0.02353292543104428,
"acc_norm": 0.25722543352601157,
"acc_norm_stderr": 0.02353292543104428
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24758842443729903,
"acc_stderr": 0.024513879973621967,
"acc_norm": 0.24758842443729903,
"acc_norm_stderr": 0.024513879973621967
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24822695035460993,
"acc_stderr": 0.025770015644290392,
"acc_norm": 0.24822695035460993,
"acc_norm_stderr": 0.025770015644290392
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2392438070404172,
"acc_stderr": 0.010896123652676653,
"acc_norm": 0.2392438070404172,
"acc_norm_stderr": 0.010896123652676653
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.20915032679738563,
"acc_stderr": 0.016453399332279326,
"acc_norm": 0.20915032679738563,
"acc_norm_stderr": 0.016453399332279326
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916707,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.03446296217088426,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.03446296217088426
},
"harness|truthfulqa:mc|0": {
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|winogrande|5": {
"acc": 0.5114443567482242,
"acc_stderr": 0.014048804199859325
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
msra_ner | ---
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- zh
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- token-classification
task_ids:
- named-entity-recognition
pretty_name: MSRA NER
dataset_info:
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
config_name: msra_ner
splits:
- name: train
num_bytes: 33323074
num_examples: 45001
- name: test
num_bytes: 2642934
num_examples: 3443
download_size: 15156606
dataset_size: 35966008
train-eval-index:
- config: msra_ner
task: token-classification
task_id: entity_extraction
splits:
train_split: train
eval_split: test
col_mapping:
tokens: tokens
ner_tags: tags
metrics:
- type: seqeval
name: seqeval
---
# Dataset Card for MSRA NER
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Github](https://github.com/OYE93/Chinese-NLP-Corpus/tree/master/NER/MSRA)
- **Repository:** [Github](https://github.com/OYE93/Chinese-NLP-Corpus)
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
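The YAML metadata above declares the `ner_tags` label inventory for this dataset. As a minimal sketch (assuming tag ids follow the order listed in the metadata), integer tags can be decoded back to their string labels like this:

```python
# Label inventory copied from the dataset_info block above
# (assumption: integer ids follow the order listed there).
NER_LABELS = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC"]

def decode_tags(tag_ids):
    """Map integer ner_tags back to their string labels."""
    return [NER_LABELS[i] for i in tag_ids]

print(decode_tags([1, 2, 0, 5, 6]))  # ['B-PER', 'I-PER', 'O', 'B-LOC', 'I-LOC']
```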
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@JetRunner](https://github.com/JetRunner) for adding this dataset. |
sam-2577/resume-dataset | ---
task_categories:
- text-classification
language:
- en
size_categories:
- 1K<n<10K
--- |
jacobbieker/goes-imerg-6hour-test | ---
license: mit
---
|
Katrg/Kafka-HonkaiStarRail | ---
license: creativeml-openrail-m
tags:
- lora
- aiartchan
- stable-diffusion
- art
---
## LoRA Description
This is a LoRA trained on the Kafka character art from Honkai: Star Rail.
Enjoy using it!
### CivitAi: https://civitai.com/models/67079/kafka-honkai-star-rail
## Example images
 |
fightfei/advices_llama2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 392092356.0
num_examples: 1430994
- name: test
num_bytes: 8001896.0
num_examples: 29204
download_size: 46548081
dataset_size: 400094252.0
---
# Dataset Card for "advices_llama2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FelixdoingAI/IP2P-5000 | ---
dataset_info:
features:
- name: original_prompt
dtype: string
- name: original_image
dtype: image
- name: edit_prompt
dtype: string
- name: edited_prompt
dtype: string
- name: edited_image
dtype: image
splits:
- name: train
num_bytes: 430361394.0
num_examples: 5000
download_size: 430263289
dataset_size: 430361394.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "instructpix2pix-clip-filtered5000-samples"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mattyhatch/tomatoesTest5 | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
struct:
- name: bytes
dtype: binary
- name: path
dtype: 'null'
splits:
- name: train
num_bytes: 352830773.0
num_examples: 557
download_size: 51228401
dataset_size: 352830773.0
---
# Dataset Card for "tomatoesTest5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Iker/InstructTranslation-EN-ES | ---
language:
- en
- es
license: apache-2.0
size_categories:
- 1K<n<10K
task_categories:
- translation
- text-generation
- text2text-generation
dataset_info:
features:
- name: id
dtype: int64
- name: conversation_no
dtype: int64
- name: from
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 2080611
num_examples: 1459
download_size: 1023377
dataset_size: 2080611
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Translation of Instructions EN-ES
This dataset contains prompts and answers from [teknium/OpenHermes-2.5](https://huggingface.co/datasets/teknium/OpenHermes-2.5) translated to Spanish using GPT-4-0125-preview. The dataset is intended to be used for training a model to translate instructions from English to Spanish.
The dataset is formatted with the [TowerInstruct](https://huggingface.co/Unbabel/TowerInstruct-13B-v0.1) format and is ready to finetune a Tower translation model. If you want the raw translations, they are available here: https://huggingface.co/datasets/Iker/InstructTranslation-EN-ES-Raw
`example_no` and `conversation_no` correspond to the original example id and conversation number in OpenHermes-2.5. Prompts are labeled as `human` and answers are labeled as `gpt`. `conversations` is the data you should use for finetuning.
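Since each record stores its turns as a list of `{"from", "value"}` dicts, turning a record into (prompt, answer) training pairs is a simple pass over adjacent turns. A minimal sketch (the record below is illustrative, not taken from the dataset):

```python
def to_pairs(record):
    """Pair each 'human' turn with the 'gpt' turn that follows it."""
    turns = record["conversations"]
    pairs = []
    for prev, nxt in zip(turns, turns[1:]):
        if prev["from"] == "human" and nxt["from"] == "gpt":
            pairs.append((prev["value"], nxt["value"]))
    return pairs

# Illustrative record mimicking the dataset's layout.
record = {
    "id": 0,
    "conversation_no": 0,
    "conversations": [
        {"from": "human", "value": "Traduce al español: Hello"},
        {"from": "gpt", "value": "Hola"},
    ],
}
print(to_pairs(record))  # [('Traduce al español: Hello', 'Hola')]
```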
|
vietgpt/databricks_dolly15k_vi | ---
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 14450287
num_examples: 15004
download_size: 7217068
dataset_size: 14450287
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
--- |
textminr/ner | ---
configs:
- config_name: default
data_files:
- split: train
path: "train.jsonl"
- split: validation
path: "validation.jsonl"
language:
- en
- de
size_categories:
- n<1K
---
Data generated by the GPT-4 API! |
GautamR/grievance_agri | ---
license: apache-2.0
---
|
BDARUI03/finetuned_dataset | ---
license: apache-2.0
---
|
567-labs/upload-test | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: embedding
sequence: float64
splits:
- name: train
num_bytes: 1015245725
num_examples: 277027
download_size: 1013958395
dataset_size: 1015245725
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
scherrmann/adhoc_quad | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
splits:
- name: train
num_bytes: 10365360
num_examples: 6659
- name: validation
num_bytes: 1157605
num_examples: 748
download_size: 3088466
dataset_size: 11522965
---
# Dataset Card for "adhoc_quad"
## Dataset Summary
The German Ad-Hoc Question Answering Dataset (AdHocQuAD) is a reading comprehension dataset for German financial texts. It is a machine-generated dataset in which ChatGPT (version 3.5) is used to generate questions about a set of German ad-hoc announcements. The answer to every question is a segment of text, or span, from the corresponding reading passage.
## Supported Tasks and Leaderboards
`extractive-qa`, `closed-domain-qa`, `open-domain-qa`, `text-retrieval`: This dataset is intended to be used for open-domain QA, but can also be used for information retrieval tasks.
## Languages
The texts in the dataset are in German (de).
## Dataset Structure
### Data Instances
A sample from the training set is provided below:

    {
      "context": "This is a test context with eight words.",
      "id": "1",
      "question": "How many words contains the context?",
      "answers": {
        "answer_start": [28],
        "text": ["eight"]
      }
    }
### Data Fields
- `id`: a string feature.
- `context`: a string feature.
- `question`: a string feature.
- `answers`: a dictionary feature containing:
  - `text`: a string feature.
  - `answer_start`: an int32 feature.
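The `answer_start` offsets index directly into `context` by character, so a span-consistency check can be sketched as follows, using the sample instance above:

```python
# Sanity check (a sketch): each answer span must match its character
# offset into the context.
sample = {
    "id": "1",
    "context": "This is a test context with eight words.",
    "question": "How many words contains the context?",
    "answers": {"answer_start": [28], "text": ["eight"]},
}

for start, text in zip(sample["answers"]["answer_start"],
                       sample["answers"]["text"]):
    span = sample["context"][start:start + len(text)]
    assert span == text, (span, text)

print("all spans consistent")
```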
## Additional Information
### Details on the Generation of the Ad-Hoc QuAD Database
To construct the ad-hoc QuAD database, I use 9,132 German ad-hoc announcements as context strings. Announcements exceeding 15 sentences are truncated to ensure compatibility with BERT's input limitations in subsequent applications.
After that, questions and appropriate answers that reference the given ad-hoc announcements need to be identified. Given that manual generation of questions and answers is both resource-intensive and time-consuming, I employ OpenAI's ChatGPT model (gpt-3.5-turbo).
In a first step, I ask ChatGPT to generate three suitable questions for a given announcement. The prompt looks as follows:
Create three questions for the following text.
It should be possible to answer the question with a substring of the input text.
The questions should ask for different aspects of the input.
The questions should be in German.
Text: <<context>>
Question:
In the pursuit of creating an extractive QuAD task, it is imperative to instruct the model such that every question can be answered using a substring from the provided announcement. This strategy aims to prevent the model from generating open-ended questions or those requiring external knowledge not present in the announcement. Additionally, the model is directed to address various aspects of the announcement to minimize question redundancy. Notably, despite the context strings being in German, ChatGPT occasionally formulates questions in English. To counteract this, explicit instructions are given to ensure questions are posed in German. Employing this methodology yields 9,132 unique context-question pairs.
In a second step, I use ChatGPT again to extract the substring that answers the question for a specific context string. The respective prompt is given by:
You have given a text and a question to that text. Find the answer as a substring of the input text.
It is crucial that the answer is contained exactly as a substring in the input text, even if this implies that the answer is not a full sentence.
Example:
Text: 'Herr Müller ist 37 Jahre alt.'
Question: 'Wie alt ist Herr Müller?'
Answer: '37 Jahre'
Text: <<context>>
Question: <<question>>
Answer:
Evaluations of the method of extracting substrings from a specified context to answer a posed question via ChatGPT indicated a recurrent issue: ChatGPT frequently transformed the substring into a complete sentence, thereby compromising the extractive nature of the resultant database. Emphasizing the necessity for extractive answers, coupled with a demonstrative example, markedly enhanced the outcomes. However, of the responses generated by ChatGPT, 1,725 are not given as substrings of the context, leading to a final ad-hoc QuAD database size of 7,407.
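The filtering step described above, dropping answers that do not occur verbatim in their context, can be sketched with a simple substring test (the example triples are illustrative):

```python
# Keep only (context, question, answer) triples whose answer occurs
# verbatim in the context, mirroring the filtering of non-extractive
# ChatGPT responses described above.
def filter_extractive(triples):
    return [t for t in triples if t[2] in t[0]]

triples = [
    ("Herr Müller ist 37 Jahre alt.", "Wie alt ist Herr Müller?",
     "37 Jahre"),
    ("Herr Müller ist 37 Jahre alt.", "Wie alt ist Herr Müller?",
     "Er ist 37 Jahre alt."),  # rephrased, not a substring -> dropped
]
kept = filter_extractive(triples)
print(len(kept))  # 1
```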
The code for creating the dataset can be found [here](https://github.com/FinTexIFB/AdHocQuAD).
### Dataset Curators
The dataset was created by Moritz Scherrmann using ChatGPT 3.5 turbo.
### Citation Information

    @misc{scherrmann2023german,
      title={German FinBERT: A German Pre-trained Language Model},
      author={Moritz Scherrmann},
      year={2023},
      eprint={2311.08793},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
    }
|
liuyanchen1015/MULTI_VALUE_qqp_double_obj_order | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 2600151
num_examples: 16897
- name: test
num_bytes: 26136993
num_examples: 169633
- name: train
num_bytes: 23145826
num_examples: 150490
download_size: 31862651
dataset_size: 51882970
---
# Dataset Card for "MULTI_VALUE_qqp_double_obj_order"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
huggingfacejs/tasks | ---
license: mit
---
This dataset is for storing assets for https://huggingface.co/tasks and https://github.com/huggingface/huggingface.js/tree/main/packages/tasks |
dharani96556/kaggle-datasets | ---
license: openrail
---
|
open-llm-leaderboard/details_OpenBuddy__openbuddy-falcon-180b-v13-preview0 | ---
pretty_name: Evaluation run of OpenBuddy/openbuddy-falcon-180b-v13-preview0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OpenBuddy/openbuddy-falcon-180b-v13-preview0](https://huggingface.co/OpenBuddy/openbuddy-falcon-180b-v13-preview0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddy__openbuddy-falcon-180b-v13-preview0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-24T12:56:17.890074](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-falcon-180b-v13-preview0/blob/main/results_2023-10-24T12-56-17.890074.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.490876677852349,\n\
\ \"em_stderr\": 0.005119615515857085,\n \"f1\": 0.5498133389261767,\n\
\ \"f1_stderr\": 0.004838031306299291,\n \"acc\": 0.6212929481268546,\n\
\ \"acc_stderr\": 0.01211195240749183\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.490876677852349,\n \"em_stderr\": 0.005119615515857085,\n\
\ \"f1\": 0.5498133389261767,\n \"f1_stderr\": 0.004838031306299291\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4162244124336619,\n \
\ \"acc_stderr\": 0.013577788334652662\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8263614838200474,\n \"acc_stderr\": 0.010646116480331\n\
\ }\n}\n```"
repo_url: https://huggingface.co/OpenBuddy/openbuddy-falcon-180b-v13-preview0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|arc:challenge|25_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|arc:challenge|25_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_24T12_56_17.890074
path:
- '**/details_harness|drop|3_2023-10-24T12-56-17.890074.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-24T12-56-17.890074.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_24T12_56_17.890074
path:
- '**/details_harness|gsm8k|5_2023-10-24T12-56-17.890074.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-24T12-56-17.890074.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hellaswag|10_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hellaswag|10_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T09-27-08.727010.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T10-53-08.711708.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T09-27-08.727010.parquet'
- split: 2023_10_11T10_53_08.711708
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T10-53-08.711708.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-11T10-53-08.711708.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_24T12_56_17.890074
path:
- '**/details_harness|winogrande|5_2023-10-24T12-56-17.890074.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-24T12-56-17.890074.parquet'
- config_name: results
data_files:
- split: 2023_10_11T09_27_08.727010
path:
- results_2023-10-11T09-27-08.727010.parquet
- split: 2023_10_11T10_53_08.711708
path:
- results_2023-10-11T10-53-08.711708.parquet
- split: 2023_10_24T12_56_17.890074
path:
- results_2023-10-24T12-56-17.890074.parquet
- split: latest
path:
- results_2023-10-24T12-56-17.890074.parquet
---
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-falcon-180b-v13-preview0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/OpenBuddy/openbuddy-falcon-180b-v13-preview0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-falcon-180b-v13-preview0](https://huggingface.co/OpenBuddy/openbuddy-falcon-180b-v13-preview0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-falcon-180b-v13-preview0",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-24T12:56:17.890074](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-falcon-180b-v13-preview0/blob/main/results_2023-10-24T12-56-17.890074.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.490876677852349,
"em_stderr": 0.005119615515857085,
"f1": 0.5498133389261767,
"f1_stderr": 0.004838031306299291,
"acc": 0.6212929481268546,
"acc_stderr": 0.01211195240749183
},
"harness|drop|3": {
"em": 0.490876677852349,
"em_stderr": 0.005119615515857085,
"f1": 0.5498133389261767,
"f1_stderr": 0.004838031306299291
},
"harness|gsm8k|5": {
"acc": 0.4162244124336619,
"acc_stderr": 0.013577788334652662
},
"harness|winogrande|5": {
"acc": 0.8263614838200474,
"acc_stderr": 0.010646116480331
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
openclimatefix/gdps | ---
license: other
license_name: canada-government-license
license_link: LICENSE
---
|
autoevaluate/autoeval-staging-eval-squad_v2-squad_v2-76c05b-14906066 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- squad_v2
eval_info:
task: extractive_question_answering
model: deepset/roberta-base-squad2
metrics: ['bertscore']
dataset_name: squad_v2
dataset_config: squad_v2
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: deepset/roberta-base-squad2
* Dataset: squad_v2
* Config: squad_v2
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@nonchalant-nagavalli](https://huggingface.co/nonchalant-nagavalli) for evaluating this model. |
BatsResearch/NusaX-senti-LexC-Gen | ---
language:
- ace
- ban
- bbc
- bjn
- bug
- mad
- min
multilinguality:
- multilingual
size_categories:
- 10K<n<100K
task_categories:
- text-classification
task_ids:
- sentiment-classification
pretty_name: LexC-Gen generated data for NusaX-senti
tags:
- synthetic
dataset_info:
- config_name: ace_100k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 6270772
num_examples: 37075
- name: validation
num_bytes: 20197
num_examples: 100
download_size: 3810625
dataset_size: 6290969
- config_name: ace_10k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 644037
num_examples: 3804
- name: validation
num_bytes: 20197
num_examples: 100
download_size: 404952
dataset_size: 664234
- config_name: ace_1k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 65615
num_examples: 371
- name: validation
num_bytes: 20197
num_examples: 100
download_size: 56527
dataset_size: 85812
- config_name: ban_100k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 7139238
num_examples: 37954
- name: validation
num_bytes: 22552
num_examples: 100
download_size: 4214903
dataset_size: 7161790
- config_name: ban_10k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 711527
num_examples: 3796
- name: validation
num_bytes: 22552
num_examples: 100
download_size: 435747
dataset_size: 734079
- config_name: ban_1k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 72895
num_examples: 382
- name: validation
num_bytes: 22552
num_examples: 100
download_size: 61808
dataset_size: 95447
- config_name: bbc_100k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 6462402
num_examples: 36956
- name: validation
num_bytes: 20630
num_examples: 100
download_size: 3953108
dataset_size: 6483032
- config_name: bbc_10k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 650856
num_examples: 3717
- name: validation
num_bytes: 20630
num_examples: 100
download_size: 413181
dataset_size: 671486
- config_name: bbc_1k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 67906
num_examples: 381
- name: validation
num_bytes: 20630
num_examples: 100
download_size: 59193
dataset_size: 88536
- config_name: bjn_100k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 6574543
num_examples: 38185
- name: validation
num_bytes: 20669
num_examples: 100
download_size: 4013692
dataset_size: 6595212
- config_name: bjn_10k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 664710
num_examples: 3873
- name: validation
num_bytes: 20669
num_examples: 100
download_size: 421679
dataset_size: 685379
- config_name: bjn_1k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 64505
num_examples: 369
- name: validation
num_bytes: 20669
num_examples: 100
download_size: 56860
dataset_size: 85174
- config_name: bug_100k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 6769177
num_examples: 37407
- name: validation
num_bytes: 21691
num_examples: 100
download_size: 3951245
dataset_size: 6790868
- config_name: bug_10k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 675851
num_examples: 3747
- name: validation
num_bytes: 21691
num_examples: 100
download_size: 408844
dataset_size: 697542
- config_name: bug_1k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 65115
num_examples: 358
- name: validation
num_bytes: 21691
num_examples: 100
download_size: 55874
dataset_size: 86806
- config_name: default
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 7287147
num_examples: 37954
download_size: 4196431
dataset_size: 7287147
- config_name: mad_100k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 7288971
num_examples: 37656
- name: validation
num_bytes: 23483
num_examples: 100
download_size: 4201408
dataset_size: 7312454
- config_name: mad_10k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 701910
num_examples: 3652
- name: validation
num_bytes: 23483
num_examples: 100
download_size: 421862
dataset_size: 725393
- config_name: mad_1k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 66557
num_examples: 350
- name: validation
num_bytes: 23483
num_examples: 100
download_size: 57540
dataset_size: 90040
- config_name: min_100k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 6300116
num_examples: 36381
- name: validation
num_bytes: 20600
num_examples: 100
download_size: 3864906
dataset_size: 6320716
- config_name: min_10k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 672223
num_examples: 3850
- name: validation
num_bytes: 20600
num_examples: 100
download_size: 427653
dataset_size: 692823
- config_name: min_1k
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 63646
num_examples: 380
- name: validation
num_bytes: 20600
num_examples: 100
download_size: 57120
dataset_size: 84246
configs:
- config_name: ace_100k
data_files:
- split: train
path: ace_100k/train-*
- split: validation
path: ace_100k/validation-*
- config_name: ace_10k
data_files:
- split: train
path: ace_10k/train-*
- split: validation
path: ace_10k/validation-*
- config_name: ace_1k
data_files:
- split: train
path: ace_1k/train-*
- split: validation
path: ace_1k/validation-*
- config_name: ban_100k
data_files:
- split: train
path: ban_100k/train-*
- split: validation
path: ban_100k/validation-*
- config_name: ban_10k
data_files:
- split: train
path: ban_10k/train-*
- split: validation
path: ban_10k/validation-*
- config_name: ban_1k
data_files:
- split: train
path: ban_1k/train-*
- split: validation
path: ban_1k/validation-*
- config_name: bbc_100k
data_files:
- split: train
path: bbc_100k/train-*
- split: validation
path: bbc_100k/validation-*
- config_name: bbc_10k
data_files:
- split: train
path: bbc_10k/train-*
- split: validation
path: bbc_10k/validation-*
- config_name: bbc_1k
data_files:
- split: train
path: bbc_1k/train-*
- split: validation
path: bbc_1k/validation-*
- config_name: bjn_100k
data_files:
- split: train
path: bjn_100k/train-*
- split: validation
path: bjn_100k/validation-*
- config_name: bjn_10k
data_files:
- split: train
path: bjn_10k/train-*
- split: validation
path: bjn_10k/validation-*
- config_name: bjn_1k
data_files:
- split: train
path: bjn_1k/train-*
- split: validation
path: bjn_1k/validation-*
- config_name: bug_100k
data_files:
- split: train
path: bug_100k/train-*
- split: validation
path: bug_100k/validation-*
- config_name: bug_10k
data_files:
- split: train
path: bug_10k/train-*
- split: validation
path: bug_10k/validation-*
- config_name: bug_1k
data_files:
- split: train
path: bug_1k/train-*
- split: validation
path: bug_1k/validation-*
- config_name: mad_100k
data_files:
- split: train
path: mad_100k/train-*
- split: validation
path: mad_100k/validation-*
- config_name: mad_10k
data_files:
- split: train
path: mad_10k/train-*
- split: validation
path: mad_10k/validation-*
- config_name: mad_1k
data_files:
- split: train
path: mad_1k/train-*
- split: validation
path: mad_1k/validation-*
- config_name: min_100k
data_files:
- split: train
path: min_100k/train-*
- split: validation
path: min_100k/validation-*
- config_name: min_10k
data_files:
- split: train
path: min_10k/train-*
- split: validation
path: min_10k/validation-*
- config_name: min_1k
data_files:
- split: train
path: min_1k/train-*
- split: validation
path: min_1k/validation-*
---
# Dataset Card for NusaX-senti-LexC-Gen
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Project Page](https://batsresearch.github.io/lexcgen/)
- **Repository:** [Github Repo](https://github.com/BatsResearch/LexC-Gen)
- **Paper:** [Arxiv](https://arxiv.org/abs/2402.14086)
- **Point of Contact:** [Zheng-Xin Yong](mailto:contact.yong@brown.edu)
### Dataset Summary
The LexC-Gen dataset for [NusaX sentiment analysis](https://huggingface.co/datasets/indonlp/NusaX-senti) task is a dataset generated for low-resource languages at scale with Large Language Models ([BLOOMZ-7.1B](https://arxiv.org/abs/2211.01786)) and [Gatitos bilingual lexicons](https://aclanthology.org/2023.emnlp-main.26/).
```python
from datasets import load_dataset
dataset = load_dataset("BatsResearch/NusaX-senti-LexC-Gen", "ace_100k")
```
### Supported Tasks and Leaderboards
- `text-classification`, `sentiment-classification`: The dataset can be used to train a model for sentiment classification. The model performance is evaluated based on the accuracy of the predicted labels as compared to the given labels in the dataset.
### Languages
The texts cover 7 extremely low-resource languages:
- Acehnese (`ace`)
- Balinese (`ban`)
- Toba Batak (`bbc`)
- Buginese (`bug`)
- Banjarese (`bjn`)
- Madurese (`mad`)
- Minangkabau (`min`)
## Dataset Structure
### Data Instances
Each data instance contains the following features: _id_, _text_, and _label_. The _label_ has 3 possible values, _0_, _1_, and _2_, which correspond to _negative_, _neutral_, and _positive_.
An example from the LexC-Gen train set looks like the following:
```
{'id': '1',
 'text': 'Anchorwoman : Hai , pubuet n't reuhung atra aneuk kumuen meulawan buli aneuk miet , ikat atra getnyan fingers ngeun saboh boh manok ngeun jangka gobnyan ho saboh pillar .',
'label': 1}
```
### Data Fields
- `id`: unique id
- `text`: generated text from LLMs
- `label`: an integer, either _0_, _1_, or _2_, indicating _negative_, _neutral_, or _positive_, respectively.
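For convenience, the integer labels can be mapped back to sentiment names with a small helper. This is a sketch of our own (the helper name is not part of the dataset); the 0/1/2 mapping follows the field description above.

```python
# Map NusaX-senti integer labels to sentiment names.
# The 0 -> negative, 1 -> neutral, 2 -> positive mapping follows the card's field description.
LABEL_NAMES = {0: "negative", 1: "neutral", 2: "positive"}

def label_to_name(label: int) -> str:
    """Return the sentiment name for an integer label."""
    return LABEL_NAMES[label]
```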
### Data Splits
The LexC-Gen dataset requires a config name in the format `{lang}_{size}`. The `lang` refers to the language code, and the `size` refers to the size of the LexC-Gen dataset before input-label consistency filtering, which takes the values `1k`, `10k`, or `100k`.
The LexC-Gen dataset has 2 splits: _train_ and _validation_. The _train_ split contains the generated LexC-Gen task training data. The _validation_ split contains the NusaX validation data that has been word-translated.
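As an illustration, a config name can be assembled from a language code and a size and then passed as the second argument to `load_dataset`, as in the example above. The validation sets below are our own, derived from the language and size lists on this card; the helper itself is hypothetical.

```python
# Language codes and sizes listed on this card (our own convenience sets).
VALID_LANGS = {"ace", "ban", "bbc", "bug", "bjn", "mad", "min"}
VALID_SIZES = {"1k", "10k", "100k"}

def config_name(lang: str, size: str) -> str:
    """Build a `{lang}_{size}` config name, e.g. 'ace_100k'."""
    if lang not in VALID_LANGS or size not in VALID_SIZES:
        raise ValueError(f"unknown config: {lang}_{size}")
    return f"{lang}_{size}"
```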
## Dataset Creation
### Curation Rationale
Extremely low-resource languages have virtually no labeled data. We explore generating data at scale for these languages using high-resource-language task data, LLMs, and bilingual lexicons to overcome the data bottleneck.
We upload the dataset to the Hugging Face Hub as an artifact of our research and to ensure the reproducibility of the results in our paper.
## Considerations for Using the Data
Our dataset is a synthetic dataset generated in English by LLMs and then translated into low-resource languages through word-to-word translation with bilingual lexicons.
It may contain English words due to imperfect translation, and it uses English syntax such as SVO word order, which is not necessarily representative of the syntax of the low-resource languages.
## Additional Information
### Dataset Curators
The LexC-Gen synthetic dataset is created by Zheng-Xin Yong.
### Licensing Information
Our dataset is generated from BLOOMZ models, which use the [BigScience RAIL License v1.0](https://huggingface.co/spaces/bigscience/license). Therefore, the RAIL license also applies to classifiers that are fine-tuned on our LexC-Gen dataset.
### Citation Information
```
@misc{yong2024lexcgen,
title={LexC-Gen: Generating Data for Extremely Low-Resource Languages with Large Language Models and Bilingual Lexicons},
author={Zheng-Xin Yong and Cristina Menghini and Stephen H. Bach},
year={2024},
eprint={2402.14086},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
shukurullo2004/animals5 | ---
license: apache-2.0
---
|
sharmapratik88/Resume_Classification | ---
license: cc
dataset_info:
features:
- name: Resume_str
dtype: string
- name: Category
dtype: string
- name: ID
dtype: int64
splits:
- name: train
num_bytes: 14092944
num_examples: 2235
- name: validation
num_bytes: 841985
num_examples: 124
- name: test
num_bytes: 797196
num_examples: 125
download_size: 7964757
dataset_size: 15732125
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_tcapelle__gemma-7b-zephyr-sft | ---
pretty_name: Evaluation run of tcapelle/gemma-7b-zephyr-sft
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [tcapelle/gemma-7b-zephyr-sft](https://huggingface.co/tcapelle/gemma-7b-zephyr-sft)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_tcapelle__gemma-7b-zephyr-sft\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-01T00:36:15.974163](https://huggingface.co/datasets/open-llm-leaderboard/details_tcapelle__gemma-7b-zephyr-sft/blob/main/results_2024-03-01T00-36-15.974163.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6033367728065783,\n\
\ \"acc_stderr\": 0.032882594243367605,\n \"acc_norm\": 0.606958636660939,\n\
\ \"acc_norm_stderr\": 0.03353521541447871,\n \"mc1\": 0.3072215422276622,\n\
\ \"mc1_stderr\": 0.01615020132132301,\n \"mc2\": 0.4334943445434731,\n\
\ \"mc2_stderr\": 0.014653429969235831\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.568259385665529,\n \"acc_stderr\": 0.014474591427196202,\n\
\ \"acc_norm\": 0.6143344709897611,\n \"acc_norm_stderr\": 0.014224250973257182\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6063533160724955,\n\
\ \"acc_stderr\": 0.004875595792850676,\n \"acc_norm\": 0.8073093009360686,\n\
\ \"acc_norm_stderr\": 0.003936061455151114\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.5259259259259259,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.6113207547169811,\n \"acc_stderr\": 0.030000485448675986,\n \
\ \"acc_norm\": 0.6113207547169811,\n \"acc_norm_stderr\": 0.030000485448675986\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\
\ \"acc_stderr\": 0.0373362665538351,\n \"acc_norm\": 0.6011560693641619,\n\
\ \"acc_norm_stderr\": 0.0373362665538351\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082637,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082637\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4603174603174603,\n \"acc_stderr\": 0.025670080636909186,\n \"\
acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.025670080636909186\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952344,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952344\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7354838709677419,\n \"acc_stderr\": 0.025091892378859275,\n \"\
acc_norm\": 0.7354838709677419,\n \"acc_norm_stderr\": 0.025091892378859275\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4433497536945813,\n \"acc_stderr\": 0.034953345821629324,\n \"\
acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.034953345821629324\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.02833560973246336,\n \"acc_norm\"\
: 0.803030303030303,\n \"acc_norm_stderr\": 0.02833560973246336\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153314,\n\
\ \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153314\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5794871794871795,\n \"acc_stderr\": 0.025028610276710862,\n\
\ \"acc_norm\": 0.5794871794871795,\n \"acc_norm_stderr\": 0.025028610276710862\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5756302521008403,\n \"acc_stderr\": 0.03210479051015776,\n \
\ \"acc_norm\": 0.5756302521008403,\n \"acc_norm_stderr\": 0.03210479051015776\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7853211009174312,\n \"acc_stderr\": 0.017604304149256476,\n \"\
acc_norm\": 0.7853211009174312,\n \"acc_norm_stderr\": 0.017604304149256476\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643525,\n \"\
acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643525\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7450980392156863,\n \"acc_stderr\": 0.030587591351604257,\n \"\
acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.030587591351604257\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n\
\ \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.034624199316156214,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.034624199316156214\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281355,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281355\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n\
\ \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n\
\ \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6040462427745664,\n \"acc_stderr\": 0.02632981334194624,\n\
\ \"acc_norm\": 0.6040462427745664,\n \"acc_norm_stderr\": 0.02632981334194624\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.01442229220480884,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.01442229220480884\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.673202614379085,\n \"acc_stderr\": 0.026857294663281416,\n\
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.026857294663281416\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6527331189710611,\n\
\ \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.6527331189710611,\n\
\ \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.025407197798890162,\n\
\ \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.025407197798890162\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41843971631205673,\n \"acc_stderr\": 0.029427994039419994,\n \
\ \"acc_norm\": 0.41843971631205673,\n \"acc_norm_stderr\": 0.029427994039419994\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4198174706649283,\n\
\ \"acc_stderr\": 0.012604960816087371,\n \"acc_norm\": 0.4198174706649283,\n\
\ \"acc_norm_stderr\": 0.012604960816087371\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5220588235294118,\n \"acc_stderr\": 0.030343264224213514,\n\
\ \"acc_norm\": 0.5220588235294118,\n \"acc_norm_stderr\": 0.030343264224213514\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6568627450980392,\n \"acc_stderr\": 0.01920660684882537,\n \
\ \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.01920660684882537\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6775510204081633,\n \"acc_stderr\": 0.02992310056368391,\n\
\ \"acc_norm\": 0.6775510204081633,\n \"acc_norm_stderr\": 0.02992310056368391\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7810945273631841,\n\
\ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.7810945273631841,\n\
\ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3072215422276622,\n\
\ \"mc1_stderr\": 0.01615020132132301,\n \"mc2\": 0.4334943445434731,\n\
\ \"mc2_stderr\": 0.014653429969235831\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7419100236779794,\n \"acc_stderr\": 0.012298278833972384\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4981046247156937,\n \
\ \"acc_stderr\": 0.013772385765569753\n }\n}\n```"
repo_url: https://huggingface.co/tcapelle/gemma-7b-zephyr-sft
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|arc:challenge|25_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|gsm8k|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hellaswag|10_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T00-36-15.974163.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T00-36-15.974163.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- '**/details_harness|winogrande|5_2024-03-01T00-36-15.974163.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-01T00-36-15.974163.parquet'
- config_name: results
data_files:
- split: 2024_03_01T00_36_15.974163
path:
- results_2024-03-01T00-36-15.974163.parquet
- split: latest
path:
- results_2024-03-01T00-36-15.974163.parquet
---
# Dataset Card for Evaluation run of tcapelle/gemma-7b-zephyr-sft
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [tcapelle/gemma-7b-zephyr-sft](https://huggingface.co/tcapelle/gemma-7b-zephyr-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_tcapelle__gemma-7b-zephyr-sft",
"harness_winogrande_5",
split="train")
```
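The timestamped splits mentioned above follow a simple naming convention: dashes and colons in the run timestamp become underscores. A minimal sketch of that mapping (the helper name `run_split_name` is my own, not part of the `datasets` API):

```python
def run_split_name(timestamp: str) -> str:
    """Map a run timestamp to its split name in this dataset.

    Dashes and colons are replaced with underscores, so
    "2024-03-01T00:36:15.974163" becomes "2024_03_01T00_36_15.974163".
    """
    return timestamp.replace("-", "_").replace(":", "_")

# To pin a specific run instead of the latest one, pass the timestamped
# split name; to follow the newest results, pass split="latest".
print(run_split_name("2024-03-01T00:36:15.974163"))  # 2024_03_01T00_36_15.974163
```

Either the timestamped split or `"latest"` can be passed as the `split` argument of `load_dataset` shown above.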
## Latest results
These are the [latest results from run 2024-03-01T00:36:15.974163](https://huggingface.co/datasets/open-llm-leaderboard/details_tcapelle__gemma-7b-zephyr-sft/blob/main/results_2024-03-01T00-36-15.974163.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6033367728065783,
"acc_stderr": 0.032882594243367605,
"acc_norm": 0.606958636660939,
"acc_norm_stderr": 0.03353521541447871,
"mc1": 0.3072215422276622,
"mc1_stderr": 0.01615020132132301,
"mc2": 0.4334943445434731,
"mc2_stderr": 0.014653429969235831
},
"harness|arc:challenge|25": {
"acc": 0.568259385665529,
"acc_stderr": 0.014474591427196202,
"acc_norm": 0.6143344709897611,
"acc_norm_stderr": 0.014224250973257182
},
"harness|hellaswag|10": {
"acc": 0.6063533160724955,
"acc_stderr": 0.004875595792850676,
"acc_norm": 0.8073093009360686,
"acc_norm_stderr": 0.003936061455151114
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5259259259259259,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.5259259259259259,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6113207547169811,
"acc_stderr": 0.030000485448675986,
"acc_norm": 0.6113207547169811,
"acc_norm_stderr": 0.030000485448675986
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.0373362665538351,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.0373362665538351
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082637,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082637
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.025670080636909186,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.025670080636909186
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677173,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952344,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952344
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7354838709677419,
"acc_stderr": 0.025091892378859275,
"acc_norm": 0.7354838709677419,
"acc_norm_stderr": 0.025091892378859275
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.034953345821629324,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.034953345821629324
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.02833560973246336,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.02833560973246336
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153314,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153314
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5794871794871795,
"acc_stderr": 0.025028610276710862,
"acc_norm": 0.5794871794871795,
"acc_norm_stderr": 0.025028610276710862
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5756302521008403,
"acc_stderr": 0.03210479051015776,
"acc_norm": 0.5756302521008403,
"acc_norm_stderr": 0.03210479051015776
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7853211009174312,
"acc_stderr": 0.017604304149256476,
"acc_norm": 0.7853211009174312,
"acc_norm_stderr": 0.017604304149256476
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643525,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643525
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.030587591351604257,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.030587591351604257
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.040103589424622034,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.040103589424622034
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.034624199316156214,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.034624199316156214
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281355,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281355
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6040462427745664,
"acc_stderr": 0.02632981334194624,
"acc_norm": 0.6040462427745664,
"acc_norm_stderr": 0.02632981334194624
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.01442229220480884,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.01442229220480884
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.026857294663281416,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.026857294663281416
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6527331189710611,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.6527331189710611,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.025407197798890162,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.025407197798890162
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41843971631205673,
"acc_stderr": 0.029427994039419994,
"acc_norm": 0.41843971631205673,
"acc_norm_stderr": 0.029427994039419994
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4198174706649283,
"acc_stderr": 0.012604960816087371,
"acc_norm": 0.4198174706649283,
"acc_norm_stderr": 0.012604960816087371
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5220588235294118,
"acc_stderr": 0.030343264224213514,
"acc_norm": 0.5220588235294118,
"acc_norm_stderr": 0.030343264224213514
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.01920660684882537,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.01920660684882537
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6775510204081633,
"acc_stderr": 0.02992310056368391,
"acc_norm": 0.6775510204081633,
"acc_norm_stderr": 0.02992310056368391
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7810945273631841,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.7810945273631841,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.038879718495972646,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.038879718495972646
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3072215422276622,
"mc1_stderr": 0.01615020132132301,
"mc2": 0.4334943445434731,
"mc2_stderr": 0.014653429969235831
},
"harness|winogrande|5": {
"acc": 0.7419100236779794,
"acc_stderr": 0.012298278833972384
},
"harness|gsm8k|5": {
"acc": 0.4981046247156937,
"acc_stderr": 0.013772385765569753
}
}
```
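As a quick illustration of how the per-task numbers above can be combined, here is a minimal sketch that macro-averages a subset of the reported accuracies. The two values are copied verbatim from this card; the task subset is illustrative only (the leaderboard itself aggregates over all evaluated tasks).

```python
# Macro-average a few of the per-task accuracies reported above.
# Values are copied verbatim from this card; the task subset is illustrative.
task_acc = {
    "hendrycksTest-marketing": 0.8760683760683761,
    "hendrycksTest-virology": 0.4759036144578313,
}

macro_avg = sum(task_acc.values()) / len(task_acc)
print(round(macro_avg, 4))
```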
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
TinyPixel/dolphin-2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1623415440
num_examples: 891857
download_size: 884160758
dataset_size: 1623415440
---
# Dataset Card for "dolphin-2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Adapting/German-Recipes | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 8620054
num_examples: 1
- name: validation
num_bytes: 1561445
num_examples: 1
download_size: 5047180
dataset_size: 10181499
---
# Dataset Card for "German-Recipes"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
maidalun1020/CrosslingualRetrievalLawZh2En | ---
license: apache-2.0
configs:
- config_name: default
data_files:
- split: queries
path: data/queries-*
- split: corpus
path: data/corpus-*
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
splits:
- name: queries
num_bytes: 5815644
num_examples: 26653
- name: corpus
num_bytes: 6672439
num_examples: 4897
download_size: 6825183
dataset_size: 12488083
---
|
loubnabnl/pii_checks_data_elm | ---
dataset_info:
features:
- name: entities
list:
- name: context
dtype: string
- name: end
dtype: int64
- name: score
dtype: float64
- name: start
dtype: int64
- name: tag
dtype: string
- name: value
dtype: string
- name: max_stars_repo_path
dtype: string
- name: max_stars_repo_name
dtype: string
- name: max_stars_count
dtype: int64
- name: content
dtype: string
- name: id
dtype: string
- name: new_content
dtype: string
- name: modified
dtype: bool
- name: references
dtype: string
- name: language
dtype: string
splits:
- name: train
num_bytes: 37294354
num_examples: 1000
download_size: 13081864
dataset_size: 37294354
---
# Dataset Card for "pii_checks_data_elm"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shreyasharma/sentences_truthv2 | ---
dataset_info:
features:
- name: sentences
dtype: string
- name: labels
dtype: int64
splits:
- name: train
num_bytes: 445901
num_examples: 7000
- name: test
num_bytes: 134824
num_examples: 2000
- name: val
num_bytes: 151364
num_examples: 2528
download_size: 254696
dataset_size: 732089
---
# Dataset Card for "sentences_truthv2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Rishi-19/Mitwa_dpo_6k | ---
tags:
- croissant
--- |
adam89/TinyStoriesChinese | ---
license: cdla-sharing-1.0
---
A Chinese translation of the [TinyStories](https://huggingface.co/datasets/roneneldan/TinyStories) dataset. Only the `story` field has been translated (the translated field is `story_zh`):
```json
{
"story": "\n\nLily and Ben are friends. They like to play in the park. One day, they see a big tree with a swing. Lily wants to try the swing. She runs to the tree and climbs on the swing.\n\"Push me, Ben!\" she says. Ben pushes her gently. Lily feels happy. She swings higher and higher. She laughs and shouts.\nBen watches Lily. He thinks she is cute. He wants to swing too. He waits for Lily to stop. But Lily does not stop. She swings faster and faster. She is having too much fun.\n\"Can I swing too, Lily?\" Ben asks. Lily does not hear him. She is too busy swinging. Ben feels sad. He walks away.\nLily swings so high that she loses her grip. She falls off the swing. She lands on the ground. She hurts her foot. She cries.\n\"Ow, ow, ow!\" she says. She looks for Ben. She wants him to help her. But Ben is not there. He is gone.\nLily feels sorry. She wishes she had shared the swing with Ben. She wishes he was there to hug her. She limps to the tree. She sees something hanging from a branch. It is Ben's hat. He left it for her.\nLily smiles. She thinks Ben is nice. She puts on his hat. She hopes he will come back. She wants to say sorry. She wants to be friends again.",
"instruction": {
"prompt:": "Write a short story (3-5 paragraphs) which only uses very simple words that a 3 year old child would understand. The story should use the verb \"hang\", the noun \"foot\" and the adjective \"cute\". The story has the following features: the story should contain at least one dialogue. Remember to only use simple words!\n\nPossible story:",
"words": [
"hang",
"foot",
"cute"
],
"features": [
"Dialogue"
]
},
"summary": "Lily and Ben play in the park and Lily gets too caught up in swinging, causing Ben to leave. Lily falls off the swing and hurts herself, but Ben leaves his hat for her as a kind gesture.",
"source": "GPT-4",
"story_zh": "莉莉和本是朋友。他们喜欢在公园里玩。有一天,他们在一棵大树下看到了一个秋千。莉莉想试试那个秋千。她跑到树下,爬上了秋千。\n\"推我,本!\"她说。本轻轻地推了她一下。莉莉感到很开心。她越荡越高,笑着喊叫。\n本看着莉莉。他觉得她很可爱。他也想荡秋千。他在莉莉停下来之后等着。但是莉莉没有停下来。她越荡越快。她玩得太高兴了。\n\"我也可以荡秋千吗,莉莉?\"本问。莉莉没听到他的话。她忙着荡秋千。本觉得很难过。他走开了。\n莉莉荡得太高,失去了平衡。她从秋千上摔下来,落在地上。她扭伤了脚。她哭了起来。\n\"哎呀,哎呀,哎呀!\"她说。她在找本。她希望他能帮助她。但本不在那里。他走了。\n莉莉感到很抱歉。她希望她能和本分享秋千。她希望他在那里拥抱她。她一瘸一拐地走到树下。她看到有什么东西挂在树枝上。那是本的帽子。他留给她的。\n莉莉笑了。她觉得本很好。她戴上了他的帽子。她希望他会回来。她想道歉。她想再次成为朋友。"
}
```
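To make the parallel structure concrete, here is a minimal, self-contained sketch of extracting an English/Chinese story pair. The record below is abridged from the example above; the field names (`story`, `story_zh`) match the dataset schema.

```python
# An abridged TinyStoriesChinese record: the English `story` field is kept,
# and its Chinese translation is stored in `story_zh`.
record = {
    "story": "Lily and Ben are friends. They like to play in the park.",
    "story_zh": "莉莉和本是朋友。他们喜欢在公园里玩。",
    "source": "GPT-4",
}

def parallel_pair(rec):
    """Return the (English, Chinese) story pair from one record."""
    return rec["story"], rec["story_zh"]

en, zh = parallel_pair(record)
print(en)
print(zh)
```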
Some examples of the translation quality:
```text
Lily and Ben are friends. They like to play in the park. One day, they see a big tree with a swing. Lily wants to try the swing. She runs to the tree and climbs on the swing.
"Push me, Ben!" she says. Ben pushes her gently. Lily feels happy. She swings higher and higher. She laughs and shouts.
Ben watches Lily. He thinks she is cute. He wants to swing too. He waits for Lily to stop. But Lily does not stop. She swings faster and faster. She is having too much fun.
"Can I swing too, Lily?" Ben asks. Lily does not hear him. She is too busy swinging. Ben feels sad. He walks away.
Lily swings so high that she loses her grip. She falls off the swing. She lands on the ground. She hurts her foot. She cries.
"Ow, ow, ow!" she says. She looks for Ben. She wants him to help her. But Ben is not there. He is gone.
Lily feels sorry. She wishes she had shared the swing with Ben. She wishes he was there to hug her. She limps to the tree. She sees something hanging from a branch. It is Ben's hat. He left it for her.
Lily smiles. She thinks Ben is nice. She puts on his hat. She hopes he will come back. She wants to say sorry. She wants to be friends again.
莉莉和本是朋友。他们喜欢在公园里玩。有一天,他们在一棵大树下看到了一个秋千。莉莉想试试那个秋千。她跑到树下,爬上了秋千。
"推我,本!"她说。本轻轻地推了她一下。莉莉感到很开心。她越荡越高,笑着喊叫。
本看着莉莉。他觉得她很可爱。他也想荡秋千。他在莉莉停下来之后等着。但是莉莉没有停下来。她越荡越快。她玩得太高兴了。
"我也可以荡秋千吗,莉莉?"本问。莉莉没听到他的话。她忙着荡秋千。本觉得很难过。他走开了。
莉莉荡得太高,失去了平衡。她从秋千上摔下来,落在地上。她扭伤了脚。她哭了起来。
"哎呀,哎呀,哎呀!"她说。她在找本。她希望他能帮助她。但本不在那里。他走了。
莉莉感到很抱歉。她希望她能和本分享秋千。她希望他在那里拥抱她。她一瘸一拐地走到树下。她看到有什么东西挂在树枝上。那是本的帽子。他留给她的。
莉莉笑了。她觉得本很好。她戴上了他的帽子。她希望他会回来。她想道歉。她想再次成为朋友。
```
```text
Once upon a time, there was a little girl named Lily. She had a teddy bear that she loved so much. One day, she lost it while playing in the park. She looked everywhere, but she couldn't find it. She felt sad and scared without her teddy bear.
Lily's mommy saw her crying and asked what was wrong. Lily told her that she lost her teddy bear. Mommy hugged her and said, "Don't worry, we'll search for it together." They went back to the park and looked everywhere. After a while, they found the teddy bear under a tree. Lily was so happy!
She hugged her teddy bear and felt comfortable again. She said, "I hope I never lose you again, teddy bear." Mommy smiled and said, "Me too, Lily. You and teddy bear are the best of friends." And they all went home, happy and content. The end.
从前,有一个小女孩叫莉莉。她非常喜欢她的泰迪熊。有一天,她在公园里玩时把它弄丢了。她找遍了所有地方,但仍然找不到它。没有她的泰迪熊,她感到很难过和害怕。
莉莉的妈妈看到她哭泣,问她发生了什么事。莉莉告诉她自己把泰迪熊弄丢了。妈妈抱住她说:“别担心,我们会一起去找的。”他们回到公园,到处寻找。过了一会儿,他们在树下找到了泰迪熊。莉莉非常高兴!
她拥抱了她的泰迪熊,感觉又舒服了。她说:“我希望我再也不要失去你,泰迪熊。”妈妈笑着说:“我也这么想,莉莉。你和泰迪熊是最好的朋友。”然后他们都高高兴兴地回家了,感到非常满足。结束。
```
```text
Once upon a time, there was a cute puppy named Max. Max was very adorable with his big, brown eyes and wagging tail. One day, Max's owner, Emily, told him that they needed to go to the post office to mail a letter. Max didn't know what that meant, but he was excited to go for a car ride.
At the post office, Emily gave the letter to the nice lady behind the desk. The lady asked Emily for a number and Emily gave her one. Max didn't know what a number was, but he saw the lady type something on the computer.
After they mailed the letter, Emily and Max went back to the car. Max was happy that they went on an adventure and he couldn't wait for the next one.
从前,有一只可爱的狗狗名叫Max。Max 非常可爱,大大的棕色眼睛和摇摆的尾巴都让人喜欢。有一天,Emily告诉Max他们需要去邮局寄一封信。Max并不知道那是什么意思,但他很兴奋能去兜风。
在邮局,Emily把信交给柜台后面友好的女士。女士问Emily要了一个号码,Emily给了她一个。Max并不知道什么是号码,但看到女士在电脑上输入了一些东西。
寄完信后,Emily和Max回到了车里。Max很高兴他们去了一趟冒险,他迫不及待地期待着下一次冒险。
```
```text
One day, a kind and honest cat named Tom found a pretty velvet ribbon. He wanted to hang it on his door. But when he tried to hang it, he saw his friend, a small bird named Sue, was sad.
Sue wanted the velvet ribbon too. Tom did not want to make Sue sad. So, he thought of a way to share the ribbon. He knew that they both liked to play games.
Tom said, "Let's play a game. We can take turns to have the ribbon. Today, I will hang it on my door. Tomorrow, you can hang it on your tree."
Sue liked this idea. They played and shared the velvet ribbon every day. Tom and Sue were both happy. They learned that sharing is a good way to solve problems and stay friends.
一天,一只名叫汤姆的善良诚实的猫发现了一条漂亮的天鹅绒彩带。他想把它挂在门上。但当他尝试挂上时,看到他的朋友,一只名叫苏的小鸟,很伤心。
苏也想要这条天鹅绒彩带。汤姆不想让苏伤心。所以,他想到了一个分享彩带的方法。他知道他们都喜欢玩游戏。
汤姆说:“我们来玩个游戏吧。我们可以轮流拥有这条彩带。今天,我把它挂在我门上。明天,你可以把它挂在你树上。”
苏喜欢这个主意。他们每天都会玩游戏并分享这根天鹅绒彩带。汤姆和苏都很开心。他们学会了分享是一种解决问题的良好方法,也是保持友谊的好方法。
``` |
dapooni/sorsolingo_en_bsl | ---
dataset_info:
features:
- name: en
dtype: string
- name: bsl
dtype: string
splits:
- name: train
num_bytes: 163845
num_examples: 3303
download_size: 117966
dataset_size: 163845
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ayan1988/diffusion.maobi3 | ---
dataset_info:
features:
- name: image
dtype: image
- name: conditioning_image
dtype: image
- name: txt
dtype: string
splits:
- name: train
num_bytes: 15526635.0
num_examples: 319
download_size: 14468827
dataset_size: 15526635.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Nomadsb212/images | ---
license: mit
---
|
joey234/mmlu-high_school_us_history-neg | ---
dataset_info:
features:
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: question
dtype: string
splits:
- name: test
num_bytes: 81951
num_examples: 204
download_size: 47725
dataset_size: 81951
---
# Dataset Card for "mmlu-high_school_us_history-neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
roupenminassian/vehicle-dataset-v2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: image_id
dtype: int64
- name: width
dtype: int64
- name: height
dtype: int64
- name: objects
struct:
- name: id
sequence: int64
- name: area
sequence: float64
- name: bbox
sequence:
sequence: float64
- name: category
sequence: int64
splits:
- name: train
num_bytes: 120781140.624
num_examples: 1128
download_size: 122076069
dataset_size: 120781140.624
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "vehicle-dataset-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
multiset/recordings_20230702_20230715 | ---
license: mit
---
This dataset consists of real-world recordings made at several locations in Hangzhou, China. Researchers may find it useful for fine-tuning speech-to-text models. For privacy reasons, the file is encrypted, and anyone interested in the dataset should contact the uploader for the access key. However, access will only be granted to those who can provide a reasonable explanation of their intended use of the data and agree to respect the privacy of the individuals in the recordings. |
Tippawan/SNMCT | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence: int64
splits:
- name: train
num_bytes: 34094
num_examples: 4
- name: validation
num_bytes: 7484
num_examples: 1
- name: test
num_bytes: 3581
num_examples: 1
download_size: 14098
dataset_size: 45159
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_crumb__nano-mistral | ---
pretty_name: Evaluation run of crumb/nano-mistral
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [crumb/nano-mistral](https://huggingface.co/crumb/nano-mistral) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_crumb__nano-mistral\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T00:27:55.291023](https://huggingface.co/datasets/open-llm-leaderboard/details_crumb__nano-mistral/blob/main/results_2024-03-10T00-27-55.291023.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2511203777273379,\n\
\ \"acc_stderr\": 0.03056453888033514,\n \"acc_norm\": 0.2515472364859563,\n\
\ \"acc_norm_stderr\": 0.03137819323372588,\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766368,\n \"mc2\": 0.4741805949532385,\n\
\ \"mc2_stderr\": 0.015622084311020428\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.17747440273037543,\n \"acc_stderr\": 0.01116513876964396,\n\
\ \"acc_norm\": 0.2167235494880546,\n \"acc_norm_stderr\": 0.01204015671348119\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.27743477394941246,\n\
\ \"acc_stderr\": 0.004468178273665653,\n \"acc_norm\": 0.28520215096594304,\n\
\ \"acc_norm_stderr\": 0.004505879084606852\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\
\ \"acc_stderr\": 0.03749850709174022,\n \"acc_norm\": 0.2518518518518518,\n\
\ \"acc_norm_stderr\": 0.03749850709174022\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2339622641509434,\n \"acc_stderr\": 0.02605529690115292,\n\
\ \"acc_norm\": 0.2339622641509434,\n \"acc_norm_stderr\": 0.02605529690115292\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403326\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n\
\ \"acc_stderr\": 0.030631145539198813,\n \"acc_norm\": 0.2023121387283237,\n\
\ \"acc_norm_stderr\": 0.030631145539198813\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.04336432707993177,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.04336432707993177\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.2,\n \"acc_stderr\": 0.04020151261036843,\n \"acc_norm\": 0.2,\n\
\ \"acc_norm_stderr\": 0.04020151261036843\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.02880998985410297,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.02880998985410297\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.040493392977481404,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.040493392977481404\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.037245636197746325,\n\
\ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.037245636197746325\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25396825396825395,\n \"acc_stderr\": 0.022418042891113942,\n \"\
acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.022418042891113942\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.18253968253968253,\n\
\ \"acc_stderr\": 0.03455071019102146,\n \"acc_norm\": 0.18253968253968253,\n\
\ \"acc_norm_stderr\": 0.03455071019102146\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.15,\n \"acc_stderr\": 0.035887028128263686,\n \
\ \"acc_norm\": 0.15,\n \"acc_norm_stderr\": 0.035887028128263686\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.3032258064516129,\n \"acc_stderr\": 0.02614868593067175,\n \"\
acc_norm\": 0.3032258064516129,\n \"acc_norm_stderr\": 0.02614868593067175\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.270935960591133,\n \"acc_stderr\": 0.031270907132976984,\n \"\
acc_norm\": 0.270935960591133,\n \"acc_norm_stderr\": 0.031270907132976984\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\"\
: 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.25252525252525254,\n \"acc_stderr\": 0.030954055470365897,\n \"\
acc_norm\": 0.25252525252525254,\n \"acc_norm_stderr\": 0.030954055470365897\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22797927461139897,\n \"acc_stderr\": 0.030276909945178253,\n\
\ \"acc_norm\": 0.22797927461139897,\n \"acc_norm_stderr\": 0.030276909945178253\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.25384615384615383,\n \"acc_stderr\": 0.022066054378726257,\n\
\ \"acc_norm\": 0.25384615384615383,\n \"acc_norm_stderr\": 0.022066054378726257\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073828,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073828\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23949579831932774,\n \"acc_stderr\": 0.027722065493361273,\n\
\ \"acc_norm\": 0.23949579831932774,\n \"acc_norm_stderr\": 0.027722065493361273\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.2,\n \"acc_stderr\": 0.01714985851425095,\n \"acc_norm\": 0.2,\n\
\ \"acc_norm_stderr\": 0.01714985851425095\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n\
\ \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24509803921568626,\n \"acc_stderr\": 0.030190282453501954,\n \"\
acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.030190282453501954\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.26582278481012656,\n \"acc_stderr\": 0.028756799629658342,\n \
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.028756799629658342\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.30493273542600896,\n\
\ \"acc_stderr\": 0.030898610882477518,\n \"acc_norm\": 0.30493273542600896,\n\
\ \"acc_norm_stderr\": 0.030898610882477518\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728742,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728742\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.3140495867768595,\n \"acc_stderr\": 0.042369647530410184,\n \"\
acc_norm\": 0.3140495867768595,\n \"acc_norm_stderr\": 0.042369647530410184\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n\
\ \"acc_stderr\": 0.04077494709252628,\n \"acc_norm\": 0.23148148148148148,\n\
\ \"acc_norm_stderr\": 0.04077494709252628\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24358974358974358,\n\
\ \"acc_stderr\": 0.028120966503914387,\n \"acc_norm\": 0.24358974358974358,\n\
\ \"acc_norm_stderr\": 0.028120966503914387\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24265644955300128,\n\
\ \"acc_stderr\": 0.01532988894089986,\n \"acc_norm\": 0.24265644955300128,\n\
\ \"acc_norm_stderr\": 0.01532988894089986\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.21241830065359477,\n \"acc_stderr\": 0.02342037547829613,\n\
\ \"acc_norm\": 0.21241830065359477,\n \"acc_norm_stderr\": 0.02342037547829613\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.20257234726688103,\n\
\ \"acc_stderr\": 0.02282731749105969,\n \"acc_norm\": 0.20257234726688103,\n\
\ \"acc_norm_stderr\": 0.02282731749105969\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2037037037037037,\n \"acc_stderr\": 0.02240967454730418,\n\
\ \"acc_norm\": 0.2037037037037037,\n \"acc_norm_stderr\": 0.02240967454730418\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2127659574468085,\n \"acc_stderr\": 0.024414612974307703,\n \
\ \"acc_norm\": 0.2127659574468085,\n \"acc_norm_stderr\": 0.024414612974307703\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24641460234680573,\n\
\ \"acc_stderr\": 0.011005971399927235,\n \"acc_norm\": 0.24641460234680573,\n\
\ \"acc_norm_stderr\": 0.011005971399927235\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.45955882352941174,\n \"acc_stderr\": 0.03027332507734575,\n\
\ \"acc_norm\": 0.45955882352941174,\n \"acc_norm_stderr\": 0.03027332507734575\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24836601307189543,\n \"acc_stderr\": 0.017479487001364764,\n \
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.017479487001364764\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n\
\ \"acc_stderr\": 0.040139645540727756,\n \"acc_norm\": 0.22727272727272727,\n\
\ \"acc_norm_stderr\": 0.040139645540727756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.22857142857142856,\n \"acc_stderr\": 0.026882144922307744,\n\
\ \"acc_norm\": 0.22857142857142856,\n \"acc_norm_stderr\": 0.026882144922307744\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409214,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409214\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n\
\ \"acc_stderr\": 0.03460579907553027,\n \"acc_norm\": 0.2710843373493976,\n\
\ \"acc_norm_stderr\": 0.03460579907553027\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03188578017686399,\n\
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03188578017686399\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766368,\n \"mc2\": 0.4741805949532385,\n\
\ \"mc2_stderr\": 0.015622084311020428\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5240726124704025,\n \"acc_stderr\": 0.014036189665395129\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/crumb/nano-mistral
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|arc:challenge|25_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|gsm8k|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hellaswag|10_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|winogrande|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T00-27-55.291023.parquet'
- config_name: results
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- results_2024-03-10T00-27-55.291023.parquet
- split: latest
path:
- results_2024-03-10T00-27-55.291023.parquet
---
# Dataset Card for Evaluation run of crumb/nano-mistral
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [crumb/nano-mistral](https://huggingface.co/crumb/nano-mistral) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_crumb__nano-mistral",
"harness_winogrande_5",
split="train")
```
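The timestamped split names are derived mechanically from the run timestamp: every `-` and `:` becomes `_`, while the fractional seconds are kept. A small helper (an illustrative sketch, not part of any official API) makes the mapping explicit:

```python
def split_name_from_run(timestamp: str) -> str:
    """Map a run timestamp (as it appears in the results filename)
    to the corresponding timestamped split name in the configs."""
    # '-' and ':' become '_'; the fractional seconds are unchanged.
    return timestamp.replace("-", "_").replace(":", "_")

# The run above, 2024-03-10T00:27:55.291023, maps to the split
# 2024_03_10T00_27_55.291023 listed in the configuration files.
print(split_name_from_run("2024-03-10T00:27:55.291023"))
```

Passing such a timestamped name as `split=` instead of `"latest"` pins the load to one specific run.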
## Latest results
These are the [latest results from run 2024-03-10T00:27:55.291023](https://huggingface.co/datasets/open-llm-leaderboard/details_crumb__nano-mistral/blob/main/results_2024-03-10T00-27-55.291023.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in its configuration, under the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2511203777273379,
"acc_stderr": 0.03056453888033514,
"acc_norm": 0.2515472364859563,
"acc_norm_stderr": 0.03137819323372588,
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766368,
"mc2": 0.4741805949532385,
"mc2_stderr": 0.015622084311020428
},
"harness|arc:challenge|25": {
"acc": 0.17747440273037543,
"acc_stderr": 0.01116513876964396,
"acc_norm": 0.2167235494880546,
"acc_norm_stderr": 0.01204015671348119
},
"harness|hellaswag|10": {
"acc": 0.27743477394941246,
"acc_stderr": 0.004468178273665653,
"acc_norm": 0.28520215096594304,
"acc_norm_stderr": 0.004505879084606852
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.03749850709174022,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.03749850709174022
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2339622641509434,
"acc_stderr": 0.02605529690115292,
"acc_norm": 0.2339622641509434,
"acc_norm_stderr": 0.02605529690115292
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2023121387283237,
"acc_stderr": 0.030631145539198813,
"acc_norm": 0.2023121387283237,
"acc_norm_stderr": 0.030631145539198813
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.04336432707993177,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.04336432707993177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036843,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036843
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.02880998985410297,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.02880998985410297
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.040493392977481404,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.040493392977481404
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.037245636197746325,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.037245636197746325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.022418042891113942,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.022418042891113942
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.18253968253968253,
"acc_stderr": 0.03455071019102146,
"acc_norm": 0.18253968253968253,
"acc_norm_stderr": 0.03455071019102146
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.15,
"acc_stderr": 0.035887028128263686,
"acc_norm": 0.15,
"acc_norm_stderr": 0.035887028128263686
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3032258064516129,
"acc_stderr": 0.02614868593067175,
"acc_norm": 0.3032258064516129,
"acc_norm_stderr": 0.02614868593067175
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.270935960591133,
"acc_stderr": 0.031270907132976984,
"acc_norm": 0.270935960591133,
"acc_norm_stderr": 0.031270907132976984
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.22424242424242424,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.22424242424242424,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.25252525252525254,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.25252525252525254,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22797927461139897,
"acc_stderr": 0.030276909945178253,
"acc_norm": 0.22797927461139897,
"acc_norm_stderr": 0.030276909945178253
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.25384615384615383,
"acc_stderr": 0.022066054378726257,
"acc_norm": 0.25384615384615383,
"acc_norm_stderr": 0.022066054378726257
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073828,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073828
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23949579831932774,
"acc_stderr": 0.027722065493361273,
"acc_norm": 0.23949579831932774,
"acc_norm_stderr": 0.027722065493361273
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.2,
"acc_stderr": 0.01714985851425095,
"acc_norm": 0.2,
"acc_norm_stderr": 0.01714985851425095
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.030190282453501954,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.030190282453501954
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.028756799629658342,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.028756799629658342
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.30493273542600896,
"acc_stderr": 0.030898610882477518,
"acc_norm": 0.30493273542600896,
"acc_norm_stderr": 0.030898610882477518
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728742,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728742
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3140495867768595,
"acc_stderr": 0.042369647530410184,
"acc_norm": 0.3140495867768595,
"acc_norm_stderr": 0.042369647530410184
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252628,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.24358974358974358,
"acc_stderr": 0.028120966503914387,
"acc_norm": 0.24358974358974358,
"acc_norm_stderr": 0.028120966503914387
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24265644955300128,
"acc_stderr": 0.01532988894089986,
"acc_norm": 0.24265644955300128,
"acc_norm_stderr": 0.01532988894089986
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.21241830065359477,
"acc_stderr": 0.02342037547829613,
"acc_norm": 0.21241830065359477,
"acc_norm_stderr": 0.02342037547829613
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.20257234726688103,
"acc_stderr": 0.02282731749105969,
"acc_norm": 0.20257234726688103,
"acc_norm_stderr": 0.02282731749105969
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.02240967454730418,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.02240967454730418
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2127659574468085,
"acc_stderr": 0.024414612974307703,
"acc_norm": 0.2127659574468085,
"acc_norm_stderr": 0.024414612974307703
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24641460234680573,
"acc_stderr": 0.011005971399927235,
"acc_norm": 0.24641460234680573,
"acc_norm_stderr": 0.011005971399927235
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.45955882352941174,
"acc_stderr": 0.03027332507734575,
"acc_norm": 0.45955882352941174,
"acc_norm_stderr": 0.03027332507734575
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.017479487001364764,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.017479487001364764
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.040139645540727756,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.040139645540727756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.22857142857142856,
"acc_stderr": 0.026882144922307744,
"acc_norm": 0.22857142857142856,
"acc_norm_stderr": 0.026882144922307744
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409214,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409214
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.03460579907553027,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.03460579907553027
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03188578017686399,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03188578017686399
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766368,
"mc2": 0.4741805949532385,
"mc2_stderr": 0.015622084311020428
},
"harness|winogrande|5": {
"acc": 0.5240726124704025,
"acc_stderr": 0.014036189665395129
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
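Since all per-task scores live in a single dict of this shape, aggregates such as the MMLU average can be recomputed client-side. The sketch below (the helper name and the two-task sample are illustrative) averages `acc` over the `hendrycksTest` tasks:

```python
def mmlu_average(results: dict) -> float:
    """Average the 'acc' metric over all hendrycksTest (MMLU) tasks
    in a results dict shaped like the JSON above."""
    accs = [
        metrics["acc"]
        for task, metrics in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    return sum(accs) / len(accs)

# Tiny sample using two of the task entries from the results above;
# non-MMLU tasks such as winogrande are ignored by the filter.
sample = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.22},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.2518518518518518},
    "harness|winogrande|5": {"acc": 0.5240726124704025},
}
print(mmlu_average(sample))
```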
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
---
pretty_name: Evaluation run of Norquinal/Mistral-7B-claude-instruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Norquinal/Mistral-7B-claude-instruct](https://huggingface.co/Norquinal/Mistral-7B-claude-instruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Norquinal__Mistral-7B-claude-instruct_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-13T15:34:36.635642](https://huggingface.co/datasets/open-llm-leaderboard/details_Norquinal__Mistral-7B-claude-instruct_public/blob/main/results_2023-11-13T15-34-36.635642.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6328012974181245,\n\
\ \"acc_stderr\": 0.032347704149397305,\n \"acc_norm\": 0.6418533753559277,\n\
\ \"acc_norm_stderr\": 0.03304428598840875,\n \"mc1\": 0.32313341493268055,\n\
\ \"mc1_stderr\": 0.0163718362864546,\n \"mc2\": 0.4747061071538381,\n\
\ \"mc2_stderr\": 0.014816247527686706,\n \"em\": 0.0014681208053691276,\n\
\ \"em_stderr\": 0.0003921042190298484,\n \"f1\": 0.06348154362416109,\n\
\ \"f1_stderr\": 0.0013886897198441997\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6023890784982935,\n \"acc_stderr\": 0.01430175222327954,\n\
\ \"acc_norm\": 0.6322525597269625,\n \"acc_norm_stderr\": 0.014090995618168484\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6502688707428799,\n\
\ \"acc_stderr\": 0.00475910343238076,\n \"acc_norm\": 0.8499302927703645,\n\
\ \"acc_norm_stderr\": 0.003564098420387769\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926604,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926604\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n \
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n\
\ \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n\
\ \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n\
\ \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n\
\ \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6333333333333333,\n \"acc_stderr\": 0.02443301646605246,\n \
\ \"acc_norm\": 0.6333333333333333,\n \"acc_norm_stderr\": 0.02443301646605246\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37777777777777777,\n \"acc_stderr\": 0.029560707392465718,\n \
\ \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.029560707392465718\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8110091743119267,\n \"acc_stderr\": 0.016785481159203627,\n \"\
acc_norm\": 0.8110091743119267,\n \"acc_norm_stderr\": 0.016785481159203627\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5509259259259259,\n \"acc_stderr\": 0.033922384053216174,\n \"\
acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.033922384053216174\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.02933116229425174,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.02933116229425174\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159256,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159256\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709218,\n\
\ \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709218\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092382,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092382\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8122605363984674,\n\
\ \"acc_stderr\": 0.013964393769899133,\n \"acc_norm\": 0.8122605363984674,\n\
\ \"acc_norm_stderr\": 0.013964393769899133\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247337,\n\
\ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247337\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3463687150837989,\n\
\ \"acc_stderr\": 0.015913546784020117,\n \"acc_norm\": 0.3463687150837989,\n\
\ \"acc_norm_stderr\": 0.015913546784020117\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.02463004897982478,\n\
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.02463004897982478\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.026160584450140453,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.026160584450140453\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.024922001168886335,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.024922001168886335\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829714,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829714\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4426336375488918,\n\
\ \"acc_stderr\": 0.012685906538206242,\n \"acc_norm\": 0.4426336375488918,\n\
\ \"acc_norm_stderr\": 0.012685906538206242\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406752,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406752\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6683006535947712,\n \"acc_stderr\": 0.019047485239360375,\n \
\ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.019047485239360375\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291286,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291286\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.036845294917747115,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.036845294917747115\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32313341493268055,\n\
\ \"mc1_stderr\": 0.0163718362864546,\n \"mc2\": 0.4747061071538381,\n\
\ \"mc2_stderr\": 0.014816247527686706\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7813733228097869,\n \"acc_stderr\": 0.011616198215773239\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.0014681208053691276,\n \
\ \"em_stderr\": 0.0003921042190298484,\n \"f1\": 0.06348154362416109,\n\
\ \"f1_stderr\": 0.0013886897198441997\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.17968157695223655,\n \"acc_stderr\": 0.010575119964242251\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Norquinal/Mistral-7B-claude-instruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|arc:challenge|25_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|drop|3_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|gsm8k|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hellaswag|10_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-34-36.635642.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T15-34-36.635642.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- '**/details_harness|winogrande|5_2023-11-13T15-34-36.635642.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-13T15-34-36.635642.parquet'
- config_name: results
data_files:
- split: 2023_11_13T15_34_36.635642
path:
- results_2023-11-13T15-34-36.635642.parquet
- split: latest
path:
- results_2023-11-13T15-34-36.635642.parquet
---
# Dataset Card for Evaluation run of Norquinal/Mistral-7B-claude-instruct
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Norquinal/Mistral-7B-claude-instruct
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Norquinal/Mistral-7B-claude-instruct](https://huggingface.co/Norquinal/Mistral-7B-claude-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Norquinal__Mistral-7B-claude-instruct_public",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-11-13T15:34:36.635642](https://huggingface.co/datasets/open-llm-leaderboard/details_Norquinal__Mistral-7B-claude-instruct_public/blob/main/results_2023-11-13T15-34-36.635642.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6328012974181245,
"acc_stderr": 0.032347704149397305,
"acc_norm": 0.6418533753559277,
"acc_norm_stderr": 0.03304428598840875,
"mc1": 0.32313341493268055,
"mc1_stderr": 0.0163718362864546,
"mc2": 0.4747061071538381,
"mc2_stderr": 0.014816247527686706,
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298484,
"f1": 0.06348154362416109,
"f1_stderr": 0.0013886897198441997
},
"harness|arc:challenge|25": {
"acc": 0.6023890784982935,
"acc_stderr": 0.01430175222327954,
"acc_norm": 0.6322525597269625,
"acc_norm_stderr": 0.014090995618168484
},
"harness|hellaswag|10": {
"acc": 0.6502688707428799,
"acc_stderr": 0.00475910343238076,
"acc_norm": 0.8499302927703645,
"acc_norm_stderr": 0.003564098420387769
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926604,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.037738099906869334,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.037738099906869334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.541871921182266,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.541871921182266,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6333333333333333,
"acc_stderr": 0.02443301646605246,
"acc_norm": 0.6333333333333333,
"acc_norm_stderr": 0.02443301646605246
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.029560707392465718,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.029560707392465718
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8110091743119267,
"acc_stderr": 0.016785481159203627,
"acc_norm": 0.8110091743119267,
"acc_norm_stderr": 0.016785481159203627
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.033922384053216174,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.033922384053216174
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.02933116229425174,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.02933116229425174
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159256,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159256
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.803680981595092,
"acc_stderr": 0.031207970394709218,
"acc_norm": 0.803680981595092,
"acc_norm_stderr": 0.031207970394709218
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092382,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092382
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8122605363984674,
"acc_stderr": 0.013964393769899133,
"acc_norm": 0.8122605363984674,
"acc_norm_stderr": 0.013964393769899133
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.024476994076247337,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.024476994076247337
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3463687150837989,
"acc_stderr": 0.015913546784020117,
"acc_norm": 0.3463687150837989,
"acc_norm_stderr": 0.015913546784020117
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.02463004897982478,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.02463004897982478
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140453,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140453
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.024922001168886335,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.024922001168886335
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829714,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829714
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4426336375488918,
"acc_stderr": 0.012685906538206242,
"acc_norm": 0.4426336375488918,
"acc_norm_stderr": 0.012685906538206242
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406752,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406752
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.019047485239360375,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.019047485239360375
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291286,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291286
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.036845294917747115,
"acc_norm": 0.84,
"acc_norm_stderr": 0.036845294917747115
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32313341493268055,
"mc1_stderr": 0.0163718362864546,
"mc2": 0.4747061071538381,
"mc2_stderr": 0.014816247527686706
},
"harness|winogrande|5": {
"acc": 0.7813733228097869,
"acc_stderr": 0.011616198215773239
},
"harness|drop|3": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298484,
"f1": 0.06348154362416109,
"f1_stderr": 0.0013886897198441997
},
"harness|gsm8k|5": {
"acc": 0.17968157695223655,
"acc_stderr": 0.010575119964242251
}
}
```
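To work with these numbers programmatically, the per-task accuracies can be averaged in plain Python. The following is a minimal sketch over a hand-copied excerpt of the JSON above; it assumes the "all" figure is an unweighted mean over tasks, which may differ from the leaderboard's exact aggregation:

```python
# Sketch: recompute an average accuracy from per-task results.
# The dict below is a small excerpt of the results JSON above; the same
# logic applies to the full dictionary.
results = {
    "harness|arc:challenge|25": {"acc": 0.6023890784982935},
    "harness|hellaswag|10": {"acc": 0.6502688707428799},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.26},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6296296296296297},
}

# Unweighted mean of "acc" over every task that reports it.
accs = [v["acc"] for v in results.values() if "acc" in v]
mean_acc = sum(accs) / len(accs)
print(f"mean acc over {len(accs)} tasks: {mean_acc:.4f}")
```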
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
repllabs/questions_how_to_do_great_work | ---
configs:
- config_name: default
data_files:
- split: processed
path: data/processed-*
- split: raw
path: data/raw-*
dataset_info:
features:
- name: question
dtype: string
- name: model
dtype: string
splits:
- name: processed
num_bytes: 17391
num_examples: 142
- name: raw
num_bytes: 55307
num_examples: 450
download_size: 28702
dataset_size: 72698
license: mit
task_categories:
- question-answering
language:
- en
size_categories:
- n<1K
---
# Questions Generated by LLM on 'How To Do Great Work'
http://paulgraham.com/greatwork.html
https://github.com/fastrepl/fastrepl/blob/main/exp/pg_essay_questions.ipynb |
Reacubeth/Quintuple | ---
license: gpl-3.0
---
## Homepage
Exploring and Verbalizing Academic Ideas by Concept Co-occurrence
[https://github.com/xyjigsaw/Kiscovery](https://github.com/xyjigsaw/Kiscovery)
## Co-occurrence Citation Quintuple
It is the official **Co-occurrence Citation Quintuple** dataset of the paper *Exploring and Verbalizing Academic Ideas by Concept Co-occurrence*.
We construct and release a dataset of co-occurrence citation quintuples, which is used to train a text generation model for idea verbalization. The process of identifying and processing concepts is similar to that of constructing the concept co-occurrence graph. Heuristic rules are adopted to filter redundant and noisy sentences, further improving the quality of the quintuples used for idea generation. More details on co-occurrence citation quintuples can be found in Appendices B, C, and J.
In mid-2023, our DeepReport system underwent a major update, encompassing both data and model improvements. On the data front, we introduced a new version of the quintuple data (V202306), resulting in enhanced quality and a larger-scale dataset. The statistical summary of the new quintuple data (V202306) is presented as follows:
| Discipline | Quintuple | Concept | Concept Pair | Total $p$ | Total $p_1$ \& $p_2$ |
| --------------------- | --------- | ------- | ------------ | --------- | -------------------- |
| Art | 7,510 | 2,671 | 5,845 | 2,770 | 7,060 |
| History | 5,287 | 2,198 | 4,654 | 2,348 | 5,764 |
| Philosophy | 45,752 | 4,773 | 25,935 | 16,896 | 29,942 |
| Sociology | 16,017 | 4,054 | 12,796 | 7,066 | 16,416 |
| Political Science | 67,975 | 6,105 | 42,411 | 26,198 | 53,933 |
| Business | 205,297 | 9,608 | 99,329 | 62,332 | 112,736 |
| Geography | 191,958 | 12,029 | 118,563 | 42,317 | 112,909 |
| Engineering | 506,635 | 16,992 | 249,935 | 137,164 | 273,894 |
| Geology | 365,183 | 13,795 | 190,002 | 98,991 | 222,358 |
| Medicine | 168,697 | 13,014 | 114,104 | 42,535 | 138,973 |
| Economics | 227,530 | 9,461 | 113,527 | 68,607 | 131,387 |
| Physics | 267,532 | 10,831 | 133,079 | 84,824 | 176,741 |
| Biology | 224,722 | 15,119 | 145,088 | 59,210 | 189,281 |
| Mathematics | 312,670 | 17,751 | 190,734 | 95,951 | 218,697 |
| Psychology | 476,342 | 9,512 | 194,038 | 115,725 | 212,180 |
| Computer Science | 531,654 | 16,591 | 244,567 | 151,809 | 238,091 |
| Environmental Science | 583,466 | 11,002 | 226,671 | 94,474 | 201,330 |
| Materials Science | 573,032 | 17,098 | 249,251 | 145,068 | 313,657 |
| Chemistry | 565,307 | 13,858 | 231,062 | 108,637 | 286,593 |
| **Total** | 5,342,566 | 206,462 | 2,591,591 | 1,362,922 | 2,941,942 |
Note that each file is a list in the form [[c_1, c_2, p, p_1, p_2], ...], each element of which is a quintuple. c_1 and c_2 are concepts, and p is the target text, i.e., the verbalized idea.
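As a rough illustration of that layout, the snippet below iterates over a quintuple list. The sample record is invented purely for illustration, and the roles of p_1 and p_2 are left unlabeled here since this card does not define them:

```python
# Sketch: iterating over one quintuple file, assuming it parses to a
# Python list of [c_1, c_2, p, p_1, p_2] entries as described above.
# The record below is invented purely for illustration.
quintuples = [
    ["concept A", "concept B", "verbalized idea text", "p1 text", "p2 text"],
]

for c1, c2, p, p1, p2 in quintuples:
    # c1 and c2 are the co-occurring concepts; p is the target idea text.
    print(f"({c1}, {c2}) -> {p}")
```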
## Download
The dataset can be downloaded with git:
```bash
sudo apt-get install git-lfs
# OR
brew install git-lfs
git lfs install
git clone https://huggingface.co/datasets/Reacubeth/Co-occurrenceCitationQuintuple
```
## Citation
If you use our work in your research or publication, please cite us as follows:
```bibtex
@inproceedings{xu2023exploring,
title={Exploring and Verbalizing Academic Ideas by Concept Co-occurrence},
author={Xu, Yi and Sheng, Shuqian and Xue, Bo and Fu, Luoyi and Wang, Xinbing and Zhou, Chenghu},
booktitle={Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (ACL)},
year={2023}
}
```
Please let us know if you have any questions or feedback. Thank you for your interest in our work!
|
odunola/yoruba_audio_data | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 4040001972.75
num_examples: 11506
download_size: 4137205952
dataset_size: 4040001972.75
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|