---
pretty_name: Evaluation run of NLUHOPOE/Mistral-7B-random-100000
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NLUHOPOE/Mistral-7B-random-100000](https://huggingface.co/NLUHOPOE/Mistral-7B-random-100000)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration, \"results\", stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NLUHOPOE__Mistral-7B-random-100000\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-26T09:57:50.433664](https://huggingface.co/datasets/open-llm-leaderboard/details_NLUHOPOE__Mistral-7B-random-100000/blob/main/results_2024-01-26T09-57-50.433664.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each one in the results and the \"latest\"\
\ split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5312649011937841,\n\
\ \"acc_stderr\": 0.03412886748292059,\n \"acc_norm\": 0.5384137413136626,\n\
\ \"acc_norm_stderr\": 0.03490969986024582,\n \"mc1\": 0.2864137086903305,\n\
\ \"mc1_stderr\": 0.015826142439502356,\n \"mc2\": 0.43163352782122394,\n\
\ \"mc2_stderr\": 0.014658079708747593\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4948805460750853,\n \"acc_stderr\": 0.014610624890309157,\n\
\ \"acc_norm\": 0.537542662116041,\n \"acc_norm_stderr\": 0.014570144495075576\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5836486755626369,\n\
\ \"acc_stderr\": 0.004919457850104236,\n \"acc_norm\": 0.7859988050189205,\n\
\ \"acc_norm_stderr\": 0.004092894578418981\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5131578947368421,\n \"acc_stderr\": 0.04067533136309172,\n\
\ \"acc_norm\": 0.5131578947368421,\n \"acc_norm_stderr\": 0.04067533136309172\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791194,\n\
\ \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791194\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.04122728707651281,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.04122728707651281\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n\
\ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.5317919075144508,\n\
\ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4765957446808511,\n \"acc_stderr\": 0.03265019475033582,\n\
\ \"acc_norm\": 0.4765957446808511,\n \"acc_norm_stderr\": 0.03265019475033582\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n\
\ \"acc_stderr\": 0.04537815354939392,\n \"acc_norm\": 0.3684210526315789,\n\
\ \"acc_norm_stderr\": 0.04537815354939392\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3306878306878307,\n \"acc_stderr\": 0.024229965298425086,\n \"\
acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.024229965298425086\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n\
\ \"acc_stderr\": 0.04104947269903394,\n \"acc_norm\": 0.30158730158730157,\n\
\ \"acc_norm_stderr\": 0.04104947269903394\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5806451612903226,\n \"acc_stderr\": 0.028071588901091838,\n \"\
acc_norm\": 0.5806451612903226,\n \"acc_norm_stderr\": 0.028071588901091838\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3793103448275862,\n \"acc_stderr\": 0.034139638059062345,\n \"\
acc_norm\": 0.3793103448275862,\n \"acc_norm_stderr\": 0.034139638059062345\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.03567969772268049,\n\
\ \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.03567969772268049\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\"\
: 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.7668393782383419,\n \"acc_stderr\": 0.03051611137147602,\n\
\ \"acc_norm\": 0.7668393782383419,\n \"acc_norm_stderr\": 0.03051611137147602\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5282051282051282,\n \"acc_stderr\": 0.025310639254933886,\n\
\ \"acc_norm\": 0.5282051282051282,\n \"acc_norm_stderr\": 0.025310639254933886\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5504201680672269,\n \"acc_stderr\": 0.03231293497137707,\n \
\ \"acc_norm\": 0.5504201680672269,\n \"acc_norm_stderr\": 0.03231293497137707\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7155963302752294,\n \"acc_stderr\": 0.01934203658770258,\n \"\
acc_norm\": 0.7155963302752294,\n \"acc_norm_stderr\": 0.01934203658770258\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.03388857118502325,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.03388857118502325\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.696078431372549,\n \"acc_stderr\": 0.032282103870378935,\n \"\
acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.032282103870378935\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7257383966244726,\n \"acc_stderr\": 0.029041333510598035,\n \
\ \"acc_norm\": 0.7257383966244726,\n \"acc_norm_stderr\": 0.029041333510598035\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n\
\ \"acc_stderr\": 0.03252113489929188,\n \"acc_norm\": 0.6233183856502242,\n\
\ \"acc_norm_stderr\": 0.03252113489929188\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.04118438565806298,\n\
\ \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.04118438565806298\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.043457245702925335,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.043457245702925335\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.047500773411999854,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.047500773411999854\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6257668711656442,\n \"acc_stderr\": 0.03802068102899615,\n\
\ \"acc_norm\": 0.6257668711656442,\n \"acc_norm_stderr\": 0.03802068102899615\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n\
\ \"acc_stderr\": 0.025819233256483717,\n \"acc_norm\": 0.8076923076923077,\n\
\ \"acc_norm_stderr\": 0.025819233256483717\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7381864623243933,\n\
\ \"acc_stderr\": 0.015720838678445266,\n \"acc_norm\": 0.7381864623243933,\n\
\ \"acc_norm_stderr\": 0.015720838678445266\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5260115606936416,\n \"acc_stderr\": 0.02688264343402289,\n\
\ \"acc_norm\": 0.5260115606936416,\n \"acc_norm_stderr\": 0.02688264343402289\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2681564245810056,\n\
\ \"acc_stderr\": 0.014816119635317012,\n \"acc_norm\": 0.2681564245810056,\n\
\ \"acc_norm_stderr\": 0.014816119635317012\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5915032679738562,\n \"acc_stderr\": 0.028146405993096358,\n\
\ \"acc_norm\": 0.5915032679738562,\n \"acc_norm_stderr\": 0.028146405993096358\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6109324758842444,\n\
\ \"acc_stderr\": 0.027690337536485372,\n \"acc_norm\": 0.6109324758842444,\n\
\ \"acc_norm_stderr\": 0.027690337536485372\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6080246913580247,\n \"acc_stderr\": 0.027163686038271146,\n\
\ \"acc_norm\": 0.6080246913580247,\n \"acc_norm_stderr\": 0.027163686038271146\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3900709219858156,\n \"acc_stderr\": 0.02909767559946393,\n \
\ \"acc_norm\": 0.3900709219858156,\n \"acc_norm_stderr\": 0.02909767559946393\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3754889178617992,\n\
\ \"acc_stderr\": 0.012367945396728213,\n \"acc_norm\": 0.3754889178617992,\n\
\ \"acc_norm_stderr\": 0.012367945396728213\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5147058823529411,\n \"acc_stderr\": 0.03035969707904612,\n\
\ \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.03035969707904612\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5147058823529411,\n \"acc_stderr\": 0.020219083895133924,\n \
\ \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.020219083895133924\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n\
\ \"acc_stderr\": 0.04724577405731572,\n \"acc_norm\": 0.5818181818181818,\n\
\ \"acc_norm_stderr\": 0.04724577405731572\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387638,\n\
\ \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387638\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6716417910447762,\n\
\ \"acc_stderr\": 0.033206858897443244,\n \"acc_norm\": 0.6716417910447762,\n\
\ \"acc_norm_stderr\": 0.033206858897443244\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7251461988304093,\n \"acc_stderr\": 0.03424042924691583,\n\
\ \"acc_norm\": 0.7251461988304093,\n \"acc_norm_stderr\": 0.03424042924691583\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2864137086903305,\n\
\ \"mc1_stderr\": 0.015826142439502356,\n \"mc2\": 0.43163352782122394,\n\
\ \"mc2_stderr\": 0.014658079708747593\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7561168113654302,\n \"acc_stderr\": 0.012068923278908189\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12964366944655042,\n \
\ \"acc_stderr\": 0.00925265775782556\n }\n}\n```"
repo_url: https://huggingface.co/NLUHOPOE/Mistral-7B-random-100000
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|arc:challenge|25_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|gsm8k|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hellaswag|10_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T09-57-50.433664.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-26T09-57-50.433664.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- '**/details_harness|winogrande|5_2024-01-26T09-57-50.433664.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-26T09-57-50.433664.parquet'
- config_name: results
data_files:
- split: 2024_01_26T09_57_50.433664
path:
- results_2024-01-26T09-57-50.433664.parquet
- split: latest
path:
- results_2024-01-26T09-57-50.433664.parquet
---
# Dataset Card for Evaluation run of NLUHOPOE/Mistral-7B-random-100000
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NLUHOPOE/Mistral-7B-random-100000](https://huggingface.co/NLUHOPOE/Mistral-7B-random-100000) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NLUHOPOE__Mistral-7B-random-100000",
"harness_winogrande_5",
split="train")
```
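Each configuration also exposes a `latest` split that aliases the most recent run, and the aggregated metrics live in the `results` configuration. Below is a minimal sketch of both, using the standard `datasets` helpers (the config names are taken from this card):
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_NLUHOPOE__Mistral-7B-random-100000"

# Enumerate the available configurations (one per evaluated task, plus "results").
configs = get_dataset_config_names(repo)
print(len(configs))

# Load the most recent run of a single task via the "latest" split alias.
gsm8k_details = load_dataset(repo, "harness_gsm8k_5", split="latest")

# Load the aggregated metrics of the run.
aggregated = load_dataset(repo, "results", split="latest")
```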
## Latest results
These are the [latest results from run 2024-01-26T09:57:50.433664](https://huggingface.co/datasets/open-llm-leaderboard/details_NLUHOPOE__Mistral-7B-random-100000/blob/main/results_2024-01-26T09-57-50.433664.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5312649011937841,
"acc_stderr": 0.03412886748292059,
"acc_norm": 0.5384137413136626,
"acc_norm_stderr": 0.03490969986024582,
"mc1": 0.2864137086903305,
"mc1_stderr": 0.015826142439502356,
"mc2": 0.43163352782122394,
"mc2_stderr": 0.014658079708747593
},
"harness|arc:challenge|25": {
"acc": 0.4948805460750853,
"acc_stderr": 0.014610624890309157,
"acc_norm": 0.537542662116041,
"acc_norm_stderr": 0.014570144495075576
},
"harness|hellaswag|10": {
"acc": 0.5836486755626369,
"acc_stderr": 0.004919457850104236,
"acc_norm": 0.7859988050189205,
"acc_norm_stderr": 0.004092894578418981
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5131578947368421,
"acc_stderr": 0.04067533136309172,
"acc_norm": 0.5131578947368421,
"acc_norm_stderr": 0.04067533136309172
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6037735849056604,
"acc_stderr": 0.030102793781791194,
"acc_norm": 0.6037735849056604,
"acc_norm_stderr": 0.030102793781791194
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.04122728707651281,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.04122728707651281
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4765957446808511,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.4765957446808511,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939392,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939392
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.024229965298425086,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.024229965298425086
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.04104947269903394,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.04104947269903394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5806451612903226,
"acc_stderr": 0.028071588901091838,
"acc_norm": 0.5806451612903226,
"acc_norm_stderr": 0.028071588901091838
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3793103448275862,
"acc_stderr": 0.034139638059062345,
"acc_norm": 0.3793103448275862,
"acc_norm_stderr": 0.034139638059062345
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.703030303030303,
"acc_stderr": 0.03567969772268049,
"acc_norm": 0.703030303030303,
"acc_norm_stderr": 0.03567969772268049
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7668393782383419,
"acc_stderr": 0.03051611137147602,
"acc_norm": 0.7668393782383419,
"acc_norm_stderr": 0.03051611137147602
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5282051282051282,
"acc_stderr": 0.025310639254933886,
"acc_norm": 0.5282051282051282,
"acc_norm_stderr": 0.025310639254933886
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683515,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683515
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5504201680672269,
"acc_stderr": 0.03231293497137707,
"acc_norm": 0.5504201680672269,
"acc_norm_stderr": 0.03231293497137707
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7155963302752294,
"acc_stderr": 0.01934203658770258,
"acc_norm": 0.7155963302752294,
"acc_norm_stderr": 0.01934203658770258
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.032282103870378935,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.032282103870378935
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7257383966244726,
"acc_stderr": 0.029041333510598035,
"acc_norm": 0.7257383966244726,
"acc_norm_stderr": 0.029041333510598035
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.03252113489929188,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.03252113489929188
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.04118438565806298,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.04118438565806298
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.043457245702925335,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.043457245702925335
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.047500773411999854,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.047500773411999854
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6257668711656442,
"acc_stderr": 0.03802068102899615,
"acc_norm": 0.6257668711656442,
"acc_norm_stderr": 0.03802068102899615
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.025819233256483717,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.025819233256483717
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7381864623243933,
"acc_stderr": 0.015720838678445266,
"acc_norm": 0.7381864623243933,
"acc_norm_stderr": 0.015720838678445266
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.02688264343402289,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.02688264343402289
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2681564245810056,
"acc_stderr": 0.014816119635317012,
"acc_norm": 0.2681564245810056,
"acc_norm_stderr": 0.014816119635317012
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5915032679738562,
"acc_stderr": 0.028146405993096358,
"acc_norm": 0.5915032679738562,
"acc_norm_stderr": 0.028146405993096358
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6109324758842444,
"acc_stderr": 0.027690337536485372,
"acc_norm": 0.6109324758842444,
"acc_norm_stderr": 0.027690337536485372
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6080246913580247,
"acc_stderr": 0.027163686038271146,
"acc_norm": 0.6080246913580247,
"acc_norm_stderr": 0.027163686038271146
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3900709219858156,
"acc_stderr": 0.02909767559946393,
"acc_norm": 0.3900709219858156,
"acc_norm_stderr": 0.02909767559946393
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3754889178617992,
"acc_stderr": 0.012367945396728213,
"acc_norm": 0.3754889178617992,
"acc_norm_stderr": 0.012367945396728213
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.03035969707904612,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.03035969707904612
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.020219083895133924,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.020219083895133924
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731572,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731572
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6448979591836734,
"acc_stderr": 0.030635655150387638,
"acc_norm": 0.6448979591836734,
"acc_norm_stderr": 0.030635655150387638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6716417910447762,
"acc_stderr": 0.033206858897443244,
"acc_norm": 0.6716417910447762,
"acc_norm_stderr": 0.033206858897443244
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7251461988304093,
"acc_stderr": 0.03424042924691583,
"acc_norm": 0.7251461988304093,
"acc_norm_stderr": 0.03424042924691583
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2864137086903305,
"mc1_stderr": 0.015826142439502356,
"mc2": 0.43163352782122394,
"mc2_stderr": 0.014658079708747593
},
"harness|winogrande|5": {
"acc": 0.7561168113654302,
"acc_stderr": 0.012068923278908189
},
"harness|gsm8k|5": {
"acc": 0.12964366944655042,
"acc_stderr": 0.00925265775782556
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
randomwalksky/cup | ---
license: openrail
---
|
imperialwarrior/open-australian-legal-qa-paraphrased-hard-gemini | ---
dataset_info:
features:
- name: index
dtype: 'null'
- name: pipeline_1_result
dtype: string
- name: pipeline_1_result_embeddings
dtype: string
- name: pipeline_2_context
dtype: string
- name: pipeline_2_result
dtype: string
- name: pipeline_2_result_embeddings
dtype: string
- name: pipeline_3_context
dtype: string
- name: pipeline_3_result
dtype: string
- name: pipeline_3_result_embeddings
dtype: string
- name: pipeline_4_context
dtype: string
- name: pipeline_4_result
dtype: string
- name: pipeline_4_result_embeddings
dtype: string
- name: pipeline_5_context
dtype: string
- name: pipeline_5_result
dtype: string
- name: pipeline_5_result_embeddings
dtype: string
- name: pipeline_6_context
dtype: string
- name: pipeline_6_result
dtype: string
- name: pipeline_6_result_embeddings
dtype: string
- name: pipeline_7_context
dtype: string
- name: pipeline_7_result
dtype: string
- name: pipeline_7_result_embeddings
dtype: string
- name: referenced_question
dtype: string
- name: answer
dtype: string
- name: question
dtype: string
- name: question_non_retrieval_embeddings
dtype: string
- name: answer_non_retrieval_embeddings
dtype: string
- name: question_retrieval_embeddings
dtype: string
- name: answer_retrieval_embeddings
dtype: string
- name: __index_level_0__
dtype: float64
- name: case_index
dtype: float64
- name: pipeline_6_case_indexes
sequence: int64
- name: pipeline_7_case_indexes
sequence: int64
splits:
- name: train
num_bytes: 40967131
num_examples: 203
download_size: 14378490
dataset_size: 40967131
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zolak/twitter_dataset_79_1713085240 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2654461
num_examples: 6615
download_size: 1346680
dataset_size: 2654461
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
botp/TigerResearch-pretrain_zh | ---
dataset_info:
features:
- name: dataType
dtype: string
- name: title
dtype: string
- name: content
dtype: string
- name: uniqueKey
dtype: string
- name: titleUkey
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 58043923125
num_examples: 16905023
download_size: 25662051889
dataset_size: 58043923125
duplicated_from: TigerResearch/pretrain_zh
---
# Dataset Card for "pretrain_zh"
The Chinese portion of the [Tigerbot](https://github.com/TigerResearch/TigerBot) pretraining data.
It contains (before compression) 12 GB of Chinese books (zh-books), 25 GB of Chinese web text (zh-webtext), and 19 GB of Chinese encyclopedia articles (zh-wiki).
For more corpora, follow the open-source models and continuous updates at [https://github.com/TigerResearch/TigerBot](https://github.com/TigerResearch/TigerBot).
## Usage
```python
import datasets

# load the upstream TigerResearch/pretrain_zh corpus (this repo is a duplicate of it)
ds_pretrain = datasets.load_dataset('TigerResearch/pretrain_zh')
``` |
sam-mosaic/evesix-llama-fmt | ---
dataset_info:
features:
- name: id
dtype: string
- name: prompt
dtype: string
- name: language
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 775938568
num_examples: 486455
download_size: 0
dataset_size: 775938568
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "evesix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
freddyaboulton/gradio-image-urls | ---
license: mit
---
|
severo/doc-image-4 | ---
size_categories:
- n<1K
---
# [doc] image dataset 4
This dataset contains 4 jpg image files in the /data directory, with a CSV metadata file providing another data column.
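A minimal loading sketch, assuming the standard imagefolder-with-metadata layout that this description implies (the `train` split name and the decoded `image` column are assumptions, not stated explicitly above):
```python
from datasets import load_dataset

# Load the images plus the extra column contributed by the CSV metadata file.
ds = load_dataset("severo/doc-image-4", split="train")
print(ds.features)  # expect an image column alongside the metadata column
print(ds[0])
```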
|
cj-mills/cvat-keypoint-toy-dataset | ---
license: mit
---
|
ovior/twitter_dataset_1713091046 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2394653
num_examples: 6954
download_size: 1377260
dataset_size: 2394653
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jxie/qg-tagging | ---
dataset_info:
features:
- name: inputs
sequence:
sequence: float64
- name: label
dtype: int64
splits:
- name: train
num_bytes: 6944726400
num_examples: 1600000
- name: val
num_bytes: 868957000
num_examples: 200000
- name: test
num_bytes: 868286700
num_examples: 200000
download_size: 3812296127
dataset_size: 8681970100
---
# Dataset Card for "qg-tagging"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
WinterSchool/ROCO | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: conversations
struct:
- name: data
list:
- name: answer
dtype: string
- name: question
dtype: string
splits:
- name: train
num_bytes: 403971903.769
num_examples: 2101
download_size: 403037177
dataset_size: 403971903.769
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- question-answering
language:
- en
tags:
- medical
size_categories:
- 1K<n<10K
---
This dataset is built from the raw data of [ROCO (Radiology Objects in COntext)](https://www.semanticscholar.org/paper/Radiology-Objects-in-COntext-(ROCO)%3A-A-Multimodal-Pelka-Koitka/a564fabf130ff6e2742cfba90c7a4018937d764d), a multimodal image dataset aimed at capturing the interplay between visual elements and semantic relations in radiology images.
For each image in the original raw dataset, we used the associated caption to generate a simulated conversation about the image between a user and a chatbot.
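A minimal sketch of consuming a row, based on the features declared in the metadata above (an `image` plus a `conversations` struct whose `data` list holds `question`/`answer` pairs):
```python
from datasets import load_dataset

ds = load_dataset("WinterSchool/ROCO", split="train")
row = ds[0]
# each row pairs a radiology image with a simulated user/chatbot conversation
for turn in row["conversations"]["data"]:
    print("Q:", turn["question"])
    print("A:", turn["answer"])
```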
|
zyxleo/cord_donut_multitask | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: task
dtype: string
- name: image_path
dtype: string
- name: ground_truth
dtype: string
- name: labels
sequence: int64
- name: input_ids
sequence: int64
splits:
- name: train
num_bytes: 1260759
num_examples: 800
- name: test
num_bytes: 93059
num_examples: 100
- name: validation
num_bytes: 86619
num_examples: 100
download_size: 299877
dataset_size: 1440437
---
# Dataset Card for "cord_donut_multitask"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
phanvancongthanh/data_part02 | ---
dataset_info:
features:
- name: smiles
dtype: string
splits:
- name: train
num_bytes: 5777591178
num_examples: 138701675
download_size: 3034948930
dataset_size: 5777591178
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "data_part02"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
loubnabnl/rmarkdown_checks | ---
dataset_info:
features:
- name: entities
list:
- name: context
dtype: string
- name: end
dtype: int64
- name: score
dtype: float32
- name: start
dtype: int64
- name: tag
dtype: string
- name: value
dtype: string
- name: max_stars_repo_path
dtype: string
- name: max_stars_repo_name
dtype: string
- name: max_stars_count
dtype: int64
- name: content
dtype: string
- name: id
dtype: string
- name: new_content
dtype: string
- name: modified
dtype: bool
- name: references
dtype: string
splits:
- name: train
num_bytes: 113015277.61771259
num_examples: 3493
download_size: 62607907
dataset_size: 113015277.61771259
---
# Dataset Card for "rmarkdown_checks"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/kanroji_mitsuri_demonslayer | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Kanroji Mitsuri (Demon Slayer)
This is the dataset of Kanroji Mitsuri (Demon Slayer), containing 101 images and their tags.
The core tags of this character are `pink_hair, long_hair, multicolored_hair, green_hair, braid, mole, mole_under_eye, gradient_hair, green_eyes, twin_braids, breasts, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 101 | 73.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kanroji_mitsuri_demonslayer/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 101 | 57.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kanroji_mitsuri_demonslayer/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 189 | 109.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kanroji_mitsuri_demonslayer/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 101 | 73.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kanroji_mitsuri_demonslayer/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 189 | 133.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kanroji_mitsuri_demonslayer/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kanroji_mitsuri_demonslayer',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, solo, closed_mouth, demon_slayer_uniform, portrait, eyelashes, smile, looking_at_viewer |
| 1 | 23 |  |  |  |  |  | 1girl, long_sleeves, demon_slayer_uniform, solo, haori, black_skirt, holding_sword, cleavage, looking_at_viewer, open_clothes, jacket, miniskirt, pleated_skirt, two-tone_hair, katana, partially_unbuttoned, closed_mouth, wide_sleeves, green_thighhighs, ribbed_legwear, belt |
| 2 | 5 |  |  |  |  |  | 1girl, demon_slayer_uniform, holding_sword, solo, closed_mouth, katana, from_side, portrait, profile |
| 3 | 8 |  |  |  |  |  | 1girl, kimono, solo, looking_at_viewer, cleavage, collarbone, haori, upper_body, blush, floral_print, open_mouth, smile, tears |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | closed_mouth | demon_slayer_uniform | portrait | eyelashes | smile | looking_at_viewer | long_sleeves | haori | black_skirt | holding_sword | cleavage | open_clothes | jacket | miniskirt | pleated_skirt | two-tone_hair | katana | partially_unbuttoned | wide_sleeves | green_thighhighs | ribbed_legwear | belt | from_side | profile | kimono | collarbone | upper_body | blush | floral_print | open_mouth | tears |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------------|:-----------------------|:-----------|:------------|:--------|:--------------------|:---------------|:--------|:--------------|:----------------|:-----------|:---------------|:---------|:------------|:----------------|:----------------|:---------|:-----------------------|:---------------|:-------------------|:-----------------|:-------|:------------|:----------|:---------|:-------------|:-------------|:--------|:---------------|:-------------|:--------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 23 |  |  |  |  |  | X | X | X | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | X | | | | | | | X | | | | | | | X | | | | | | X | X | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | X | | | | | X | X | | X | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X |
|
zhengxuanzenwu/fair_glue_qnli | ---
dataset_info:
features:
- name: question
dtype: string
- name: sentence
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': not_entailment
- name: idx
dtype: int32
splits:
- name: train
num_bytes: 25612443
num_examples: 104743
- name: validation
num_bytes: 250467.5086948563
num_examples: 1000
- name: test
num_bytes: 1368304
num_examples: 5463
download_size: 18562140
dataset_size: 27231214.508694857
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
miguelinc/oratorialab | ---
license: cc-by-sa-4.0
task_categories:
- image-classification
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
HaawkeNeural/latent_diffusion | ---
license: other
---
|
CesarLeblanc/geoplantbert_fill_mask_dataset | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 881264679
num_examples: 900042
- name: test
num_bytes: 97851633
num_examples: 99958
download_size: 388245641
dataset_size: 979116312
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
coref-data/litbank_indiscrim | ---
dataset_info:
- config_name: split_0
features:
- name: sentences
list:
- name: id
dtype: int64
- name: misc
struct:
- name: parse_tree
dtype: string
- name: speaker
dtype: 'null'
- name: text
dtype: string
- name: tokens
list:
- name: deprel
dtype: string
- name: end_char
dtype: int64
- name: feats
dtype: string
- name: head
dtype: int64
- name: id
dtype: int64
- name: lemma
dtype: string
- name: misc
dtype: string
- name: start_char
dtype: int64
- name: text
dtype: string
- name: upos
dtype: string
- name: xpos
dtype: string
- name: coref_chains
sequence:
sequence:
sequence: int64
- name: id
dtype: string
- name: text
dtype: string
- name: genre
dtype: string
- name: meta_data
struct:
- name: author
dtype: string
- name: comment
dtype: string
- name: date
dtype: string
- name: gutenberg_id
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 66722053
num_examples: 80
- name: validation
num_bytes: 9538946
num_examples: 10
- name: test
num_bytes: 10206291
num_examples: 10
download_size: 44024474
dataset_size: 86467290
- config_name: split_1
features:
- name: sentences
list:
- name: id
dtype: int64
- name: speaker
dtype: 'null'
- name: text
dtype: string
- name: tokens
list:
- name: id
dtype: int64
- name: text
dtype: string
- name: coref_chains
sequence:
sequence:
sequence: int64
- name: id
dtype: string
- name: text
dtype: string
- name: genre
dtype: string
- name: meta_data
struct:
- name: author
dtype: string
- name: comment
dtype: string
- name: date
dtype: string
- name: gutenberg_id
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 51521261
num_examples: 80
- name: validation
num_bytes: 8300522
num_examples: 10
- name: test
num_bytes: 7127546
num_examples: 10
download_size: 40296693
dataset_size: 66949329
- config_name: split_2
features:
- name: sentences
list:
- name: id
dtype: int64
- name: speaker
dtype: 'null'
- name: text
dtype: string
- name: tokens
list:
- name: id
dtype: int64
- name: text
dtype: string
- name: coref_chains
sequence:
sequence:
sequence: int64
- name: id
dtype: string
- name: text
dtype: string
- name: genre
dtype: string
- name: meta_data
struct:
- name: author
dtype: string
- name: comment
dtype: string
- name: date
dtype: string
- name: gutenberg_id
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 51695718
num_examples: 80
- name: validation
num_bytes: 7127546
num_examples: 10
- name: test
num_bytes: 8126065
num_examples: 10
download_size: 40287905
dataset_size: 66949329
- config_name: split_3
features:
- name: sentences
list:
- name: id
dtype: int64
- name: speaker
dtype: 'null'
- name: text
dtype: string
- name: tokens
list:
- name: id
dtype: int64
- name: text
dtype: string
- name: coref_chains
sequence:
sequence:
sequence: int64
- name: id
dtype: string
- name: text
dtype: string
- name: genre
dtype: string
- name: meta_data
struct:
- name: author
dtype: string
- name: comment
dtype: string
- name: date
dtype: string
- name: gutenberg_id
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 52504381
num_examples: 80
- name: validation
num_bytes: 8126065
num_examples: 10
- name: test
num_bytes: 6318883
num_examples: 10
download_size: 40292412
dataset_size: 66949329
- config_name: split_4
features:
- name: sentences
list:
- name: id
dtype: int64
- name: speaker
dtype: 'null'
- name: text
dtype: string
- name: tokens
list:
- name: id
dtype: int64
- name: text
dtype: string
- name: coref_chains
sequence:
sequence:
sequence: int64
- name: id
dtype: string
- name: text
dtype: string
- name: genre
dtype: string
- name: meta_data
struct:
- name: author
dtype: string
- name: comment
dtype: string
- name: date
dtype: string
- name: gutenberg_id
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 54684836
num_examples: 80
- name: validation
num_bytes: 6318883
num_examples: 10
- name: test
num_bytes: 5945610
num_examples: 10
download_size: 40283365
dataset_size: 66949329
- config_name: split_5
features:
- name: sentences
list:
- name: id
dtype: int64
- name: speaker
dtype: 'null'
- name: text
dtype: string
- name: tokens
list:
- name: id
dtype: int64
- name: text
dtype: string
- name: coref_chains
sequence:
sequence:
sequence: int64
- name: id
dtype: string
- name: text
dtype: string
- name: genre
dtype: string
- name: meta_data
struct:
- name: author
dtype: string
- name: comment
dtype: string
- name: date
dtype: string
- name: gutenberg_id
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 53798360
num_examples: 80
- name: validation
num_bytes: 5945610
num_examples: 10
- name: test
num_bytes: 7205359
num_examples: 10
download_size: 40284379
dataset_size: 66949329
- config_name: split_6
features:
- name: sentences
list:
- name: id
dtype: int64
- name: speaker
dtype: 'null'
- name: text
dtype: string
- name: tokens
list:
- name: id
dtype: int64
- name: text
dtype: string
- name: coref_chains
sequence:
sequence:
sequence: int64
- name: id
dtype: string
- name: text
dtype: string
- name: genre
dtype: string
- name: meta_data
struct:
- name: author
dtype: string
- name: comment
dtype: string
- name: date
dtype: string
- name: gutenberg_id
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 53481831
num_examples: 80
- name: validation
num_bytes: 7205359
num_examples: 10
- name: test
num_bytes: 6262139
num_examples: 10
download_size: 40294155
dataset_size: 66949329
- config_name: split_7
features:
- name: sentences
list:
- name: id
dtype: int64
- name: speaker
dtype: 'null'
- name: text
dtype: string
- name: tokens
list:
- name: id
dtype: int64
- name: text
dtype: string
- name: coref_chains
sequence:
sequence:
sequence: int64
- name: id
dtype: string
- name: text
dtype: string
- name: genre
dtype: string
- name: meta_data
struct:
- name: author
dtype: string
- name: comment
dtype: string
- name: date
dtype: string
- name: gutenberg_id
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 54849391
num_examples: 80
- name: validation
num_bytes: 6262139
num_examples: 10
- name: test
num_bytes: 5837799
num_examples: 10
download_size: 40294847
dataset_size: 66949329
- config_name: split_8
features:
- name: sentences
list:
- name: id
dtype: int64
- name: speaker
dtype: 'null'
- name: text
dtype: string
- name: tokens
list:
- name: id
dtype: int64
- name: text
dtype: string
- name: coref_chains
sequence:
sequence:
sequence: int64
- name: id
dtype: string
- name: text
dtype: string
- name: genre
dtype: string
- name: meta_data
struct:
- name: author
dtype: string
- name: comment
dtype: string
- name: date
dtype: string
- name: gutenberg_id
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 56921350
num_examples: 80
- name: validation
num_bytes: 5837799
num_examples: 10
- name: test
num_bytes: 4190180
num_examples: 10
download_size: 40292974
dataset_size: 66949329
- config_name: split_9
features:
- name: sentences
list:
- name: id
dtype: int64
- name: speaker
dtype: 'null'
- name: text
dtype: string
- name: tokens
list:
- name: id
dtype: int64
- name: text
dtype: string
- name: coref_chains
sequence:
sequence:
sequence: int64
- name: id
dtype: string
- name: text
dtype: string
- name: genre
dtype: string
- name: meta_data
struct:
- name: author
dtype: string
- name: comment
dtype: string
- name: date
dtype: string
- name: gutenberg_id
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 55123923
num_examples: 80
- name: validation
num_bytes: 4190180
num_examples: 10
- name: test
num_bytes: 7635226
num_examples: 10
download_size: 40294593
dataset_size: 66949329
configs:
- config_name: split_0
data_files:
- split: train
path: split_0/train-*
- split: validation
path: split_0/validation-*
- split: test
path: split_0/test-*
- config_name: split_1
data_files:
- split: train
path: split_1/train-*
- split: validation
path: split_1/validation-*
- split: test
path: split_1/test-*
- config_name: split_2
data_files:
- split: train
path: split_2/train-*
- split: validation
path: split_2/validation-*
- split: test
path: split_2/test-*
- config_name: split_3
data_files:
- split: train
path: split_3/train-*
- split: validation
path: split_3/validation-*
- split: test
path: split_3/test-*
- config_name: split_4
data_files:
- split: train
path: split_4/train-*
- split: validation
path: split_4/validation-*
- split: test
path: split_4/test-*
- config_name: split_5
data_files:
- split: train
path: split_5/train-*
- split: validation
path: split_5/validation-*
- split: test
path: split_5/test-*
- config_name: split_6
data_files:
- split: train
path: split_6/train-*
- split: validation
path: split_6/validation-*
- split: test
path: split_6/test-*
- config_name: split_7
data_files:
- split: train
path: split_7/train-*
- split: validation
path: split_7/validation-*
- split: test
path: split_7/test-*
- config_name: split_8
data_files:
- split: train
path: split_8/train-*
- split: validation
path: split_8/validation-*
- split: test
path: split_8/test-*
- config_name: split_9
data_files:
- split: train
path: split_9/train-*
- split: validation
path: split_9/validation-*
- split: test
path: split_9/test-*
---
This dataset was generated by reformatting [`coref-data/litbank_raw`](https://huggingface.co/datasets/coref-data/litbank_raw) into the indiscrim coreference format. See that repo for dataset details.
See [ianporada/coref-data](https://github.com/ianporada/coref-data) for additional conversion details and the conversion script.
Please create an issue in the repo above or in this dataset repo for any questions.
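A loading sketch based on the configs declared above (ten folds, `split_0` through `split_9`, each with train/validation/test splits):
```python
from datasets import load_dataset

# pick one of the ten cross-validation folds
ds = load_dataset("coref-data/litbank_indiscrim", "split_0")
doc = ds["train"][0]
print(doc["id"], doc["genre"], doc["meta_data"]["title"])
print("coref chains in first document:", len(doc["coref_chains"]))
```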
|
CyberHarem/akutsu_ruri_ahogirl | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Akutsu Ruri
This is the dataset of Akutsu Ruri, containing 109 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 109 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 224 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 109 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 109 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 109 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 109 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 109 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 224 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 224 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 224 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
Innominate/LargeConvo2048 | ---
dataset_info:
features:
- name: input
dtype: string
splits:
- name: train
num_bytes: 1793023981
num_examples: 984989
download_size: 974905351
dataset_size: 1793023981
task_categories:
- text-generation
---
A large dataset used to train Churro. Every element is under 2048 tokens when tokenized with the LLaMA tokenizer.
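A sketch that spot-checks the stated token bound on a few rows; the tokenizer checkpoint used below is an assumption, since the card does not name one:
```python
from datasets import load_dataset
from transformers import AutoTokenizer

# Assumption: "huggyllama/llama-7b" stands in for the unnamed LLaMA tokenizer.
tokenizer = AutoTokenizer.from_pretrained("huggyllama/llama-7b")
ds = load_dataset("Innominate/LargeConvo2048", split="train")
for row in ds.select(range(5)):
    n_tokens = len(tokenizer(row["input"])["input_ids"])
    assert n_tokens < 2048, n_tokens
```
|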
KenBars/item_rec | ---
license: mit
---
|
qq371/11111 | ---
license: epl-2.0
---
|
FaalSa/f1 | ---
dataset_info:
features:
- name: start
dtype: timestamp[s]
- name: target
sequence: float32
- name: item_id
dtype: string
- name: feat_static_cat
sequence: uint64
splits:
- name: train
num_bytes: 79710
num_examples: 1
- name: validation
num_bytes: 80190
num_examples: 1
- name: test
num_bytes: 80670
num_examples: 1
download_size: 69501
dataset_size: 240570
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
satyanshu404/MS-Marco-Prompt-generation | ---
license: unknown
---
|
andersonbcdefg/dolly-ai-filtered | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: response
dtype: string
- name: category
dtype: string
splits:
- name: train
num_bytes: 2939273
num_examples: 5444
download_size: 0
dataset_size: 2939273
---
# Dataset Card for "dolly-ai-filtered"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JovialValley/phoneme_totalMapped0 | ---
dataset_info:
features:
- name: input_values
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 108844668
num_examples: 389
- name: test
num_bytes: 27494376
num_examples: 98
download_size: 137098876
dataset_size: 136339044
---
# Dataset Card for "phoneme_totalMapped0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
julienmercier/mobile-eye-tracking-dataset-v3 | ---
license: cc-by-nc-nd-4.0
---
|
huggingartists/joni-mitchell | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/joni-mitchell"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.703544 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/ed9a330b2539058076e0c48398599b09.1000x1000x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/joni-mitchell">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Joni Mitchell</div>
<a href="https://genius.com/artists/joni-mitchell">
<div style="text-align: center; font-size: 14px;">@joni-mitchell</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/joni-mitchell).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/joni-mitchell")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train | validation | test |
|------:|-----------:|-----:|
|   559 |          - |    - |
The 'train' split can easily be divided into 'train', 'validation', and 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/joni-mitchell")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
num_texts = len(datasets['train']['text'])
train, validation, test = np.split(
    datasets['train']['text'],
    [int(num_texts * train_percentage), int(num_texts * (train_percentage + validation_percentage))],
)
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
author={Aleksey Korshuk},
year={2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
KPrashanth/articles_dataset | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 13354657
num_examples: 3188
- name: validation
num_bytes: 3257643
num_examples: 798
- name: test
num_bytes: 4221414
num_examples: 997
download_size: 9383756
dataset_size: 20833714
task_categories:
- text-generation
- summarization
- text2text-generation
- text-classification
---
# Dataset Card for "articles_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_fblgit__una-cybertron-7b-v3-OMA | ---
pretty_name: Evaluation run of fblgit/una-cybertron-7b-v3-OMA
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [fblgit/una-cybertron-7b-v3-OMA](https://huggingface.co/fblgit/una-cybertron-7b-v3-OMA)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fblgit__una-cybertron-7b-v3-OMA\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-16T14:22:11.823260](https://huggingface.co/datasets/open-llm-leaderboard/details_fblgit__una-cybertron-7b-v3-OMA/blob/main/results_2023-12-16T14-22-11.823260.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6407887213707157,\n\
\ \"acc_stderr\": 0.032306194957506966,\n \"acc_norm\": 0.6401991329877219,\n\
\ \"acc_norm_stderr\": 0.03297436021123899,\n \"mc1\": 0.5801713586291309,\n\
\ \"mc1_stderr\": 0.017277030301775766,\n \"mc2\": 0.6984571807866093,\n\
\ \"mc2_stderr\": 0.01516400593831668\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7098976109215017,\n \"acc_stderr\": 0.013261573677520766,\n\
\ \"acc_norm\": 0.7303754266211604,\n \"acc_norm_stderr\": 0.012968040686869155\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7183827922724557,\n\
\ \"acc_stderr\": 0.0044886843979795015,\n \"acc_norm\": 0.8794064927305317,\n\
\ \"acc_norm_stderr\": 0.0032498873947065044\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.048108401480826346,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.048108401480826346\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.02519710107424649,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.02519710107424649\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.04451807959055328,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.04451807959055328\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8,\n\
\ \"acc_stderr\": 0.022755204959542943,\n \"acc_norm\": 0.8,\n \
\ \"acc_norm_stderr\": 0.022755204959542943\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.035107665979592154,\n\
\ \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479047,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479047\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768766,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768766\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6410256410256411,\n \"acc_stderr\": 0.024321738484602354,\n\
\ \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.024321738484602354\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \
\ \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8366972477064221,\n \"acc_stderr\": 0.01584825580650155,\n \"\
acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.01584825580650155\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516301,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516301\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371802,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371802\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577612,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577612\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45251396648044695,\n\
\ \"acc_stderr\": 0.016646914804438775,\n \"acc_norm\": 0.45251396648044695,\n\
\ \"acc_norm_stderr\": 0.016646914804438775\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.02625605383571896,\n\
\ \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.02625605383571896\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46479791395045633,\n\
\ \"acc_stderr\": 0.012738547371303957,\n \"acc_norm\": 0.46479791395045633,\n\
\ \"acc_norm_stderr\": 0.012738547371303957\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.0286619962023353,\n\
\ \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.0286619962023353\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6584967320261438,\n \"acc_stderr\": 0.019184639328092487,\n \
\ \"acc_norm\": 0.6584967320261438,\n \"acc_norm_stderr\": 0.019184639328092487\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.025196929874827075,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.025196929874827075\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5801713586291309,\n\
\ \"mc1_stderr\": 0.017277030301775766,\n \"mc2\": 0.6984571807866093,\n\
\ \"mc2_stderr\": 0.01516400593831668\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8208366219415943,\n \"acc_stderr\": 0.010777949156047987\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6770280515542078,\n \
\ \"acc_stderr\": 0.012880360794851815\n }\n}\n```"
repo_url: https://huggingface.co/fblgit/una-cybertron-7b-v3-OMA
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|arc:challenge|25_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|gsm8k|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hellaswag|10_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T14-22-11.823260.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T14-22-11.823260.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- '**/details_harness|winogrande|5_2023-12-16T14-22-11.823260.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-16T14-22-11.823260.parquet'
- config_name: results
data_files:
- split: 2023_12_16T14_22_11.823260
path:
- results_2023-12-16T14-22-11.823260.parquet
- split: latest
path:
- results_2023-12-16T14-22-11.823260.parquet
---
# Dataset Card for Evaluation run of fblgit/una-cybertron-7b-v3-OMA
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [fblgit/una-cybertron-7b-v3-OMA](https://huggingface.co/fblgit/una-cybertron-7b-v3-OMA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fblgit__una-cybertron-7b-v3-OMA",
"harness_winogrande_5",
split="train")
```
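The same call pattern works for any configuration listed in the metadata above. As a minimal sketch (using only the standard `datasets` API), the aggregated scores can be pulled from the "results" configuration:
```python
from datasets import load_dataset

# The "latest" split always mirrors the most recent run; timestamped splits
# such as "2023_12_16T14_22_11.823260" pin a specific run (see configs above).
results = load_dataset(
    "open-llm-leaderboard/details_fblgit__una-cybertron-7b-v3-OMA",
    "results",
    split="latest",
)
print(results[0])  # a single row holding the aggregated metrics for the run
```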
## Latest results
These are the [latest results from run 2023-12-16T14:22:11.823260](https://huggingface.co/datasets/open-llm-leaderboard/details_fblgit__una-cybertron-7b-v3-OMA/blob/main/results_2023-12-16T14-22-11.823260.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6407887213707157,
"acc_stderr": 0.032306194957506966,
"acc_norm": 0.6401991329877219,
"acc_norm_stderr": 0.03297436021123899,
"mc1": 0.5801713586291309,
"mc1_stderr": 0.017277030301775766,
"mc2": 0.6984571807866093,
"mc2_stderr": 0.01516400593831668
},
"harness|arc:challenge|25": {
"acc": 0.7098976109215017,
"acc_stderr": 0.013261573677520766,
"acc_norm": 0.7303754266211604,
"acc_norm_stderr": 0.012968040686869155
},
"harness|hellaswag|10": {
"acc": 0.7183827922724557,
"acc_stderr": 0.0044886843979795015,
"acc_norm": 0.8794064927305317,
"acc_norm_stderr": 0.0032498873947065044
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.048108401480826346,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.048108401480826346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.02519710107424649,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.02519710107424649
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.04451807959055328,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.04451807959055328
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8,
"acc_stderr": 0.022755204959542943,
"acc_norm": 0.8,
"acc_norm_stderr": 0.022755204959542943
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479047,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479047
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768766,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768766
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6410256410256411,
"acc_stderr": 0.024321738484602354,
"acc_norm": 0.6410256410256411,
"acc_norm_stderr": 0.024321738484602354
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.031282177063684614,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.031282177063684614
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.01584825580650155,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.01584825580650155
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516301,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516301
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371802,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371802
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577612,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577612
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.45251396648044695,
"acc_stderr": 0.016646914804438775,
"acc_norm": 0.45251396648044695,
"acc_norm_stderr": 0.016646914804438775
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.02625605383571896,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.02625605383571896
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46479791395045633,
"acc_stderr": 0.012738547371303957,
"acc_norm": 0.46479791395045633,
"acc_norm_stderr": 0.012738547371303957
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.0286619962023353,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.0286619962023353
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6584967320261438,
"acc_stderr": 0.019184639328092487,
"acc_norm": 0.6584967320261438,
"acc_norm_stderr": 0.019184639328092487
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827075,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827075
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5801713586291309,
"mc1_stderr": 0.017277030301775766,
"mc2": 0.6984571807866093,
"mc2_stderr": 0.01516400593831668
},
"harness|winogrande|5": {
"acc": 0.8208366219415943,
"acc_stderr": 0.010777949156047987
},
"harness|gsm8k|5": {
"acc": 0.6770280515542078,
"acc_stderr": 0.012880360794851815
}
}
```
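One way to summarize the 57 `hendrycksTest` subjects above is an unweighted mean of their accuracies. A minimal sketch of recomputing that from the dict shown, assuming it has been saved locally as `results.json` (a hypothetical filename):
```python
import json

# Assumes the results dict shown above was saved locally as "results.json"
# (hypothetical filename; adapt the path as needed).
with open("results.json") as f:
    results = json.load(f)

# Unweighted mean of the per-subject MMLU (hendrycksTest) accuracies.
accs = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"{len(accs)} subjects, mean acc = {sum(accs) / len(accs):.4f}")
```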
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
hippocrates/MedQA_one_shot_test | ---
dataset_info:
features:
- name: id
dtype: string
- name: query
dtype: string
- name: answer
dtype: string
- name: choices
sequence: string
- name: gold
dtype: int64
splits:
- name: train
num_bytes: 2491204
num_examples: 1273
- name: valid
num_bytes: 2491204
num_examples: 1273
- name: test
num_bytes: 2491204
num_examples: 1273
download_size: 2482272
dataset_size: 7473612
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
---
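A minimal sketch of loading one example and inspecting the fields declared above; the assumption that `gold` indexes into `choices` is not stated in the metadata and is only illustrative:
```python
from datasets import load_dataset

ds = load_dataset("hippocrates/MedQA_one_shot_test", split="test")
ex = ds[0]
print(ex["id"], ex["query"][:80])  # question prompt (truncated)
print(ex["choices"])               # list of answer options
# Illustrative assumption: "gold" is the index of the correct option in "choices".
print(ex["choices"][ex["gold"]], "->", ex["answer"])
```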
|
Tristan/olm-wikipedia-20221220-1-percent-tokenized-766 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: special_tokens_mask
sequence: int8
splits:
- name: train
num_bytes: 300178944
num_examples: 65143
download_size: 93964466
dataset_size: 300178944
---
# Dataset Card for "olm-wikipedia-20221220-1-percent-tokenized-766"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/imai_midori_shirobako | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Imai Midori
This is the dataset of Imai Midori, containing 276 images and their tags.
Images were crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 276 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 611 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 276 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 276 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 276 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 276 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 276 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 611 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 611 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 611 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
open-llm-leaderboard/details_Aryanne__sheared-plus-westlake-50_75p | ---
pretty_name: Evaluation run of Aryanne/sheared-plus-westlake-50_75p
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Aryanne/sheared-plus-westlake-50_75p](https://huggingface.co/Aryanne/sheared-plus-westlake-50_75p)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aryanne__sheared-plus-westlake-50_75p\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-23T22:04:31.166175](https://huggingface.co/datasets/open-llm-leaderboard/details_Aryanne__sheared-plus-westlake-50_75p/blob/main/results_2024-01-23T22-04-31.166175.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2672140448144015,\n\
\ \"acc_stderr\": 0.03127543112931,\n \"acc_norm\": 0.26909676356851875,\n\
\ \"acc_norm_stderr\": 0.03210076459110669,\n \"mc1\": 0.2668298653610771,\n\
\ \"mc1_stderr\": 0.015483691939237265,\n \"mc2\": 0.42638955634632064,\n\
\ \"mc2_stderr\": 0.014788435851867392\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3310580204778157,\n \"acc_stderr\": 0.013752062419817836,\n\
\ \"acc_norm\": 0.34044368600682595,\n \"acc_norm_stderr\": 0.013847460518892983\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4441346345349532,\n\
\ \"acc_stderr\": 0.004958537988993581,\n \"acc_norm\": 0.5804620593507269,\n\
\ \"acc_norm_stderr\": 0.004924748500639335\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.32592592592592595,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.32592592592592595,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.27631578947368424,\n \"acc_stderr\": 0.03639057569952924,\n\
\ \"acc_norm\": 0.27631578947368424,\n \"acc_norm_stderr\": 0.03639057569952924\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.32,\n\
\ \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.24150943396226415,\n \"acc_stderr\": 0.026341480371118355,\n\
\ \"acc_norm\": 0.24150943396226415,\n \"acc_norm_stderr\": 0.026341480371118355\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\"\
: 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.20425531914893616,\n \"acc_stderr\": 0.026355158413349417,\n\
\ \"acc_norm\": 0.20425531914893616,\n \"acc_norm_stderr\": 0.026355158413349417\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.04096985139843671,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.04096985139843671\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131184,\n\
\ \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131184\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.02256989707491841,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02256989707491841\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n\
\ \"acc_stderr\": 0.03567016675276863,\n \"acc_norm\": 0.1984126984126984,\n\
\ \"acc_norm_stderr\": 0.03567016675276863\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.24516129032258063,\n \"acc_stderr\": 0.02447224384089553,\n \"\
acc_norm\": 0.24516129032258063,\n \"acc_norm_stderr\": 0.02447224384089553\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.26108374384236455,\n \"acc_stderr\": 0.030903796952114468,\n \"\
acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.030903796952114468\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.28484848484848485,\n \"acc_stderr\": 0.035243908445117836,\n\
\ \"acc_norm\": 0.28484848484848485,\n \"acc_norm_stderr\": 0.035243908445117836\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.21212121212121213,\n \"acc_stderr\": 0.029126522834586818,\n \"\
acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.029126522834586818\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.27461139896373055,\n \"acc_stderr\": 0.03221024508041156,\n\
\ \"acc_norm\": 0.27461139896373055,\n \"acc_norm_stderr\": 0.03221024508041156\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2692307692307692,\n \"acc_stderr\": 0.022489389793654824,\n\
\ \"acc_norm\": 0.2692307692307692,\n \"acc_norm_stderr\": 0.022489389793654824\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02730914058823018,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02730914058823018\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.19747899159663865,\n \"acc_stderr\": 0.025859164122051463,\n\
\ \"acc_norm\": 0.19747899159663865,\n \"acc_norm_stderr\": 0.025859164122051463\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.20733944954128442,\n \"acc_stderr\": 0.017381415563608664,\n \"\
acc_norm\": 0.20733944954128442,\n \"acc_norm_stderr\": 0.017381415563608664\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3287037037037037,\n \"acc_stderr\": 0.03203614084670058,\n \"\
acc_norm\": 0.3287037037037037,\n \"acc_norm_stderr\": 0.03203614084670058\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604246,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604246\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.26582278481012656,\n \"acc_stderr\": 0.02875679962965834,\n \
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.02875679962965834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2809917355371901,\n \"acc_stderr\": 0.04103203830514512,\n \"\
acc_norm\": 0.2809917355371901,\n \"acc_norm_stderr\": 0.04103203830514512\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3055555555555556,\n\
\ \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.3055555555555556,\n\
\ \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2822085889570552,\n \"acc_stderr\": 0.03536117886664742,\n\
\ \"acc_norm\": 0.2822085889570552,\n \"acc_norm_stderr\": 0.03536117886664742\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1650485436893204,\n \"acc_stderr\": 0.036756688322331886,\n\
\ \"acc_norm\": 0.1650485436893204,\n \"acc_norm_stderr\": 0.036756688322331886\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2264957264957265,\n\
\ \"acc_stderr\": 0.02742100729539294,\n \"acc_norm\": 0.2264957264957265,\n\
\ \"acc_norm_stderr\": 0.02742100729539294\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23499361430395913,\n\
\ \"acc_stderr\": 0.01516202415227844,\n \"acc_norm\": 0.23499361430395913,\n\
\ \"acc_norm_stderr\": 0.01516202415227844\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2976878612716763,\n \"acc_stderr\": 0.024617055388677003,\n\
\ \"acc_norm\": 0.2976878612716763,\n \"acc_norm_stderr\": 0.024617055388677003\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n\
\ \"acc_stderr\": 0.014465893829859926,\n \"acc_norm\": 0.24916201117318434,\n\
\ \"acc_norm_stderr\": 0.014465893829859926\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.23202614379084968,\n \"acc_stderr\": 0.02417084087934101,\n\
\ \"acc_norm\": 0.23202614379084968,\n \"acc_norm_stderr\": 0.02417084087934101\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.29260450160771706,\n\
\ \"acc_stderr\": 0.025839898334877983,\n \"acc_norm\": 0.29260450160771706,\n\
\ \"acc_norm_stderr\": 0.025839898334877983\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2191358024691358,\n \"acc_stderr\": 0.023016705640262192,\n\
\ \"acc_norm\": 0.2191358024691358,\n \"acc_norm_stderr\": 0.023016705640262192\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2198581560283688,\n \"acc_stderr\": 0.024706141070705474,\n \
\ \"acc_norm\": 0.2198581560283688,\n \"acc_norm_stderr\": 0.024706141070705474\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2379400260756193,\n\
\ \"acc_stderr\": 0.010875700787694228,\n \"acc_norm\": 0.2379400260756193,\n\
\ \"acc_norm_stderr\": 0.010875700787694228\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.35661764705882354,\n \"acc_stderr\": 0.029097209568411955,\n\
\ \"acc_norm\": 0.35661764705882354,\n \"acc_norm_stderr\": 0.029097209568411955\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25980392156862747,\n \"acc_stderr\": 0.017740899509177795,\n \
\ \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.017740899509177795\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n\
\ \"acc_stderr\": 0.038950910157241364,\n \"acc_norm\": 0.20909090909090908,\n\
\ \"acc_norm_stderr\": 0.038950910157241364\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.20408163265306123,\n \"acc_stderr\": 0.025801283475090506,\n\
\ \"acc_norm\": 0.20408163265306123,\n \"acc_norm_stderr\": 0.025801283475090506\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.25870646766169153,\n\
\ \"acc_stderr\": 0.030965903123573012,\n \"acc_norm\": 0.25870646766169153,\n\
\ \"acc_norm_stderr\": 0.030965903123573012\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n\
\ \"acc_stderr\": 0.03460579907553027,\n \"acc_norm\": 0.2710843373493976,\n\
\ \"acc_norm_stderr\": 0.03460579907553027\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.28654970760233917,\n \"acc_stderr\": 0.03467826685703826,\n\
\ \"acc_norm\": 0.28654970760233917,\n \"acc_norm_stderr\": 0.03467826685703826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2668298653610771,\n\
\ \"mc1_stderr\": 0.015483691939237265,\n \"mc2\": 0.42638955634632064,\n\
\ \"mc2_stderr\": 0.014788435851867392\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.569060773480663,\n \"acc_stderr\": 0.013917796623335964\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Aryanne/sheared-plus-westlake-50_75p
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|arc:challenge|25_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|gsm8k|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hellaswag|10_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T22-04-31.166175.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-23T22-04-31.166175.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- '**/details_harness|winogrande|5_2024-01-23T22-04-31.166175.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-23T22-04-31.166175.parquet'
- config_name: results
data_files:
- split: 2024_01_23T22_04_31.166175
path:
- results_2024-01-23T22-04-31.166175.parquet
- split: latest
path:
- results_2024-01-23T22-04-31.166175.parquet
---
# Dataset Card for Evaluation run of Aryanne/sheared-plus-westlake-50_75p
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Aryanne/sheared-plus-westlake-50_75p](https://huggingface.co/Aryanne/sheared-plus-westlake-50_75p) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Aryanne__sheared-plus-westlake-50_75p",
"harness_winogrande_5",
split="train")
```
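The same call pattern extends to any configuration listed in this card's metadata. As a short sketch (the config and split names below are taken verbatim from the `configs` section above), you can pull the aggregated metrics or pin a task to a specific timestamped run:
```python
from datasets import load_dataset

# Aggregated metrics for the whole run (the "results" configuration).
results = load_dataset("open-llm-leaderboard/details_Aryanne__sheared-plus-westlake-50_75p",
                       "results",
                       split="latest")

# Per-sample details for one task, pinned to a run via its timestamped split.
gsm8k = load_dataset("open-llm-leaderboard/details_Aryanne__sheared-plus-westlake-50_75p",
                     "harness_gsm8k_5",
                     split="2024_01_23T22_04_31.166175")
```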
## Latest results
These are the [latest results from run 2024-01-23T22:04:31.166175](https://huggingface.co/datasets/open-llm-leaderboard/details_Aryanne__sheared-plus-westlake-50_75p/blob/main/results_2024-01-23T22-04-31.166175.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2672140448144015,
"acc_stderr": 0.03127543112931,
"acc_norm": 0.26909676356851875,
"acc_norm_stderr": 0.03210076459110669,
"mc1": 0.2668298653610771,
"mc1_stderr": 0.015483691939237265,
"mc2": 0.42638955634632064,
"mc2_stderr": 0.014788435851867392
},
"harness|arc:challenge|25": {
"acc": 0.3310580204778157,
"acc_stderr": 0.013752062419817836,
"acc_norm": 0.34044368600682595,
"acc_norm_stderr": 0.013847460518892983
},
"harness|hellaswag|10": {
"acc": 0.4441346345349532,
"acc_stderr": 0.004958537988993581,
"acc_norm": 0.5804620593507269,
"acc_norm_stderr": 0.004924748500639335
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.27631578947368424,
"acc_stderr": 0.03639057569952924,
"acc_norm": 0.27631578947368424,
"acc_norm_stderr": 0.03639057569952924
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24150943396226415,
"acc_stderr": 0.026341480371118355,
"acc_norm": 0.24150943396226415,
"acc_norm_stderr": 0.026341480371118355
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.20425531914893616,
"acc_stderr": 0.026355158413349417,
"acc_norm": 0.20425531914893616,
"acc_norm_stderr": 0.026355158413349417
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.04096985139843671,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.04096985139843671
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.22758620689655173,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.22758620689655173,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02256989707491841,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02256989707491841
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1984126984126984,
"acc_stderr": 0.03567016675276863,
"acc_norm": 0.1984126984126984,
"acc_norm_stderr": 0.03567016675276863
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24516129032258063,
"acc_stderr": 0.02447224384089553,
"acc_norm": 0.24516129032258063,
"acc_norm_stderr": 0.02447224384089553
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.26108374384236455,
"acc_stderr": 0.030903796952114468,
"acc_norm": 0.26108374384236455,
"acc_norm_stderr": 0.030903796952114468
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.28484848484848485,
"acc_stderr": 0.035243908445117836,
"acc_norm": 0.28484848484848485,
"acc_norm_stderr": 0.035243908445117836
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.27461139896373055,
"acc_stderr": 0.03221024508041156,
"acc_norm": 0.27461139896373055,
"acc_norm_stderr": 0.03221024508041156
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2692307692307692,
"acc_stderr": 0.022489389793654824,
"acc_norm": 0.2692307692307692,
"acc_norm_stderr": 0.022489389793654824
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02730914058823018,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02730914058823018
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.19747899159663865,
"acc_stderr": 0.025859164122051463,
"acc_norm": 0.19747899159663865,
"acc_norm_stderr": 0.025859164122051463
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.20733944954128442,
"acc_stderr": 0.017381415563608664,
"acc_norm": 0.20733944954128442,
"acc_norm_stderr": 0.017381415563608664
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3287037037037037,
"acc_stderr": 0.03203614084670058,
"acc_norm": 0.3287037037037037,
"acc_norm_stderr": 0.03203614084670058
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2809917355371901,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.2809917355371901,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2822085889570552,
"acc_stderr": 0.03536117886664742,
"acc_norm": 0.2822085889570552,
"acc_norm_stderr": 0.03536117886664742
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.1650485436893204,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.1650485436893204,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2264957264957265,
"acc_stderr": 0.02742100729539294,
"acc_norm": 0.2264957264957265,
"acc_norm_stderr": 0.02742100729539294
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23499361430395913,
"acc_stderr": 0.01516202415227844,
"acc_norm": 0.23499361430395913,
"acc_norm_stderr": 0.01516202415227844
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2976878612716763,
"acc_stderr": 0.024617055388677003,
"acc_norm": 0.2976878612716763,
"acc_norm_stderr": 0.024617055388677003
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24916201117318434,
"acc_stderr": 0.014465893829859926,
"acc_norm": 0.24916201117318434,
"acc_norm_stderr": 0.014465893829859926
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.23202614379084968,
"acc_stderr": 0.02417084087934101,
"acc_norm": 0.23202614379084968,
"acc_norm_stderr": 0.02417084087934101
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.29260450160771706,
"acc_stderr": 0.025839898334877983,
"acc_norm": 0.29260450160771706,
"acc_norm_stderr": 0.025839898334877983
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2191358024691358,
"acc_stderr": 0.023016705640262192,
"acc_norm": 0.2191358024691358,
"acc_norm_stderr": 0.023016705640262192
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2198581560283688,
"acc_stderr": 0.024706141070705474,
"acc_norm": 0.2198581560283688,
"acc_norm_stderr": 0.024706141070705474
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2379400260756193,
"acc_stderr": 0.010875700787694228,
"acc_norm": 0.2379400260756193,
"acc_norm_stderr": 0.010875700787694228
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.35661764705882354,
"acc_stderr": 0.029097209568411955,
"acc_norm": 0.35661764705882354,
"acc_norm_stderr": 0.029097209568411955
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.017740899509177795,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.017740899509177795
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.038950910157241364,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.038950910157241364
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.20408163265306123,
"acc_stderr": 0.025801283475090506,
"acc_norm": 0.20408163265306123,
"acc_norm_stderr": 0.025801283475090506
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.25870646766169153,
"acc_stderr": 0.030965903123573012,
"acc_norm": 0.25870646766169153,
"acc_norm_stderr": 0.030965903123573012
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.03460579907553027,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.03460579907553027
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.28654970760233917,
"acc_stderr": 0.03467826685703826,
"acc_norm": 0.28654970760233917,
"acc_norm_stderr": 0.03467826685703826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2668298653610771,
"mc1_stderr": 0.015483691939237265,
"mc2": 0.42638955634632064,
"mc2_stderr": 0.014788435851867392
},
"harness|winogrande|5": {
"acc": 0.569060773480663,
"acc_stderr": 0.013917796623335964
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
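Because the aggregated metrics are also published as plain JSON (linked above), they are easy to inspect or sanity-check offline. A minimal sketch, assuming a local copy of the results file with the same top-level layout as the snippet above, and assuming n = 100 test questions for abstract_algebra:
```python
import json
import math

# Assumes results_2024-01-23T22-04-31.166175.json was downloaded locally and
# mirrors the structure shown above (task names as top-level keys).
with open("results_2024-01-23T22-04-31.166175.json") as f:
    results = json.load(f)

print(results["all"]["acc"])  # 0.2672140448144015
print(results["harness|hendrycksTest-abstract_algebra|5"]["acc"])  # 0.27

# The reported acc_stderr is consistent with the sample standard error of a
# Bernoulli mean, sqrt(p * (1 - p) / (n - 1)):
p, n = 0.27, 100
print(math.sqrt(p * (1 - p) / (n - 1)))  # ~0.0446196, matching acc_stderr
```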
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_kekmodel__StopCarbon-10.7B-v3 | ---
pretty_name: Evaluation run of kekmodel/StopCarbon-10.7B-v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [kekmodel/StopCarbon-10.7B-v3](https://huggingface.co/kekmodel/StopCarbon-10.7B-v3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kekmodel__StopCarbon-10.7B-v3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-30T10:15:50.941228](https://huggingface.co/datasets/open-llm-leaderboard/details_kekmodel__StopCarbon-10.7B-v3/blob/main/results_2023-12-30T10-15-50.941228.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6649827029825734,\n\
\ \"acc_stderr\": 0.03166471620730208,\n \"acc_norm\": 0.6659533597079996,\n\
\ \"acc_norm_stderr\": 0.03230745700615819,\n \"mc1\": 0.572827417380661,\n\
\ \"mc1_stderr\": 0.017316834410963926,\n \"mc2\": 0.7193506614125464,\n\
\ \"mc2_stderr\": 0.014949525122441177\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6860068259385665,\n \"acc_stderr\": 0.013562691224726293,\n\
\ \"acc_norm\": 0.7098976109215017,\n \"acc_norm_stderr\": 0.013261573677520764\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7180840470025891,\n\
\ \"acc_stderr\": 0.004490130691020433,\n \"acc_norm\": 0.8856801433977295,\n\
\ \"acc_norm_stderr\": 0.003175490413694419\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.743421052631579,\n \"acc_stderr\": 0.0355418036802569,\n\
\ \"acc_norm\": 0.743421052631579,\n \"acc_norm_stderr\": 0.0355418036802569\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.72,\n\
\ \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n\
\ \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n\
\ \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n\
\ \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n\
\ \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768077\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6212765957446809,\n\
\ \"acc_stderr\": 0.03170995606040655,\n \"acc_norm\": 0.6212765957446809,\n\
\ \"acc_norm_stderr\": 0.03170995606040655\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n\
\ \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.6206896551724138,\n \"acc_stderr\": 0.040434618619167466,\n \"\
acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.040434618619167466\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4947089947089947,\n \"acc_stderr\": 0.02574986828855657,\n \"\
acc_norm\": 0.4947089947089947,\n \"acc_norm_stderr\": 0.02574986828855657\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8161290322580645,\n \"acc_stderr\": 0.022037217340267822,\n \"\
acc_norm\": 0.8161290322580645,\n \"acc_norm_stderr\": 0.022037217340267822\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"\
acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n\
\ \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822516,\n \"\
acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822516\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768763\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857406,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857406\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634332,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634332\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242741,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242741\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669235,\n \"\
acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669235\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5648148148148148,\n \"acc_stderr\": 0.03381200005643527,\n \"\
acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.03381200005643527\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.02485747808025046,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.02485747808025046\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8565400843881856,\n \"acc_stderr\": 0.022818291821017012,\n \
\ \"acc_norm\": 0.8565400843881856,\n \"acc_norm_stderr\": 0.022818291821017012\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.03160295143776678,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.03160295143776678\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709696,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709696\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.03492606476623791,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.03492606476623791\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.80970625798212,\n\
\ \"acc_stderr\": 0.014036945850381396,\n \"acc_norm\": 0.80970625798212,\n\
\ \"acc_norm_stderr\": 0.014036945850381396\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4201117318435754,\n\
\ \"acc_stderr\": 0.016507671073256402,\n \"acc_norm\": 0.4201117318435754,\n\
\ \"acc_norm_stderr\": 0.016507671073256402\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.02463004897982478,\n\
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.02463004897982478\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7746913580246914,\n \"acc_stderr\": 0.023246202647819753,\n\
\ \"acc_norm\": 0.7746913580246914,\n \"acc_norm_stderr\": 0.023246202647819753\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4908735332464146,\n\
\ \"acc_stderr\": 0.01276810860164001,\n \"acc_norm\": 0.4908735332464146,\n\
\ \"acc_norm_stderr\": 0.01276810860164001\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7463235294117647,\n \"acc_stderr\": 0.026431329870789527,\n\
\ \"acc_norm\": 0.7463235294117647,\n \"acc_norm_stderr\": 0.026431329870789527\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.02553843336857834,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.02553843336857834\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n\
\ \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n\
\ \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n\
\ \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.572827417380661,\n\
\ \"mc1_stderr\": 0.017316834410963926,\n \"mc2\": 0.7193506614125464,\n\
\ \"mc2_stderr\": 0.014949525122441177\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8318863456985004,\n \"acc_stderr\": 0.010510336954166732\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6322971948445792,\n \
\ \"acc_stderr\": 0.013281630503395475\n }\n}\n```"
repo_url: https://huggingface.co/kekmodel/StopCarbon-10.7B-v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|arc:challenge|25_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|gsm8k|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hellaswag|10_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T10-15-50.941228.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-30T10-15-50.941228.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- '**/details_harness|winogrande|5_2023-12-30T10-15-50.941228.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-30T10-15-50.941228.parquet'
- config_name: results
data_files:
- split: 2023_12_30T10_15_50.941228
path:
- results_2023-12-30T10-15-50.941228.parquet
- split: latest
path:
- results_2023-12-30T10-15-50.941228.parquet
---
# Dataset Card for Evaluation run of kekmodel/StopCarbon-10.7B-v3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kekmodel/StopCarbon-10.7B-v3](https://huggingface.co/kekmodel/StopCarbon-10.7B-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kekmodel__StopCarbon-10.7B-v3",
"harness_winogrande_5",
split="train")
```
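Each configuration also exposes a split named after the run timestamp, alongside the `latest` split. The sketch below (assuming the `datasets` library and the config and split names listed in this card's metadata) enumerates the available configurations and loads one specific timestamped run:
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_kekmodel__StopCarbon-10.7B-v3"

# Enumerate the available configurations (one per evaluated task, plus "results").
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# Load a specific run by its timestamped split name instead of "latest";
# the split name below is taken from this card's configuration metadata.
gsm8k_details = load_dataset(
    repo,
    "harness_gsm8k_5",
    split="2023_12_30T10_15_50.941228",
)
print(gsm8k_details)
```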
## Latest results
These are the [latest results from run 2023-12-30T10:15:50.941228](https://huggingface.co/datasets/open-llm-leaderboard/details_kekmodel__StopCarbon-10.7B-v3/blob/main/results_2023-12-30T10-15-50.941228.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6649827029825734,
"acc_stderr": 0.03166471620730208,
"acc_norm": 0.6659533597079996,
"acc_norm_stderr": 0.03230745700615819,
"mc1": 0.572827417380661,
"mc1_stderr": 0.017316834410963926,
"mc2": 0.7193506614125464,
"mc2_stderr": 0.014949525122441177
},
"harness|arc:challenge|25": {
"acc": 0.6860068259385665,
"acc_stderr": 0.013562691224726293,
"acc_norm": 0.7098976109215017,
"acc_norm_stderr": 0.013261573677520764
},
"harness|hellaswag|10": {
"acc": 0.7180840470025891,
"acc_stderr": 0.004490130691020433,
"acc_norm": 0.8856801433977295,
"acc_norm_stderr": 0.003175490413694419
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.743421052631579,
"acc_stderr": 0.0355418036802569,
"acc_norm": 0.743421052631579,
"acc_norm_stderr": 0.0355418036802569
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6212765957446809,
"acc_stderr": 0.03170995606040655,
"acc_norm": 0.6212765957446809,
"acc_norm_stderr": 0.03170995606040655
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.040434618619167466,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.040434618619167466
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4947089947089947,
"acc_stderr": 0.02574986828855657,
"acc_norm": 0.4947089947089947,
"acc_norm_stderr": 0.02574986828855657
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8161290322580645,
"acc_stderr": 0.022037217340267822,
"acc_norm": 0.8161290322580645,
"acc_norm_stderr": 0.022037217340267822
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8121212121212121,
"acc_stderr": 0.03050193405942914,
"acc_norm": 0.8121212121212121,
"acc_norm_stderr": 0.03050193405942914
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822516,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822516
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768763,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768763
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857406,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857406
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.029344572500634332,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.029344572500634332
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242741,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242741
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669235,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669235
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.03381200005643527,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.03381200005643527
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.02485747808025046,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.02485747808025046
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8565400843881856,
"acc_stderr": 0.022818291821017012,
"acc_norm": 0.8565400843881856,
"acc_norm_stderr": 0.022818291821017012
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776678,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776678
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709696,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709696
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037182,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037182
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.03492606476623791,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.03492606476623791
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.80970625798212,
"acc_stderr": 0.014036945850381396,
"acc_norm": 0.80970625798212,
"acc_norm_stderr": 0.014036945850381396
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4201117318435754,
"acc_stderr": 0.016507671073256402,
"acc_norm": 0.4201117318435754,
"acc_norm_stderr": 0.016507671073256402
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.02463004897982478,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.02463004897982478
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7746913580246914,
"acc_stderr": 0.023246202647819753,
"acc_norm": 0.7746913580246914,
"acc_norm_stderr": 0.023246202647819753
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4908735332464146,
"acc_stderr": 0.01276810860164001,
"acc_norm": 0.4908735332464146,
"acc_norm_stderr": 0.01276810860164001
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7463235294117647,
"acc_stderr": 0.026431329870789527,
"acc_norm": 0.7463235294117647,
"acc_norm_stderr": 0.026431329870789527
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.02553843336857834,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.02553843336857834
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.572827417380661,
"mc1_stderr": 0.017316834410963926,
"mc2": 0.7193506614125464,
"mc2_stderr": 0.014949525122441177
},
"harness|winogrande|5": {
"acc": 0.8318863456985004,
"acc_stderr": 0.010510336954166732
},
"harness|gsm8k|5": {
"acc": 0.6322971948445792,
"acc_stderr": 0.013281630503395475
}
}
```
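The aggregated scores above can also be fetched directly as a JSON file with `huggingface_hub`. This is a minimal sketch: it assumes the downloaded file contains the dictionary printed above (some result files nest the per-task scores under a `results` key, which the snippet accounts for):
```python
import json

from huggingface_hub import hf_hub_download

# Download the aggregated results file referenced in "Latest results".
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_kekmodel__StopCarbon-10.7B-v3",
    filename="results_2023-12-30T10-15-50.941228.json",
    repo_type="dataset",
)

with open(path) as f:
    data = json.load(f)

# Assumption: the file holds the dictionary shown above; if the scores are
# nested under a "results" key instead, fall back to that.
scores = data.get("results", data)
print(scores["all"]["acc_norm"], scores["harness|gsm8k|5"]["acc"])
```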
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_Dans-DiscountModels__TinyMistral-v2.5-MiniPile-Guidelines-E1 | ---
pretty_name: Evaluation run of Dans-DiscountModels/TinyMistral-v2.5-MiniPile-Guidelines-E1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Dans-DiscountModels/TinyMistral-v2.5-MiniPile-Guidelines-E1](https://huggingface.co/Dans-DiscountModels/TinyMistral-v2.5-MiniPile-Guidelines-E1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Dans-DiscountModels__TinyMistral-v2.5-MiniPile-Guidelines-E1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-01T21:44:48.551629](https://huggingface.co/datasets/open-llm-leaderboard/details_Dans-DiscountModels__TinyMistral-v2.5-MiniPile-Guidelines-E1/blob/main/results_2024-02-01T21-44-48.551629.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.235616909983485,\n\
\ \"acc_stderr\": 0.030096196864815804,\n \"acc_norm\": 0.23617812988980863,\n\
\ \"acc_norm_stderr\": 0.030895352600212644,\n \"mc1\": 0.2460220318237454,\n\
\ \"mc1_stderr\": 0.015077219200662578,\n \"mc2\": 0.49848823283731625,\n\
\ \"mc2_stderr\": 0.016449164481650215\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2090443686006826,\n \"acc_stderr\": 0.011882746987406458,\n\
\ \"acc_norm\": 0.2645051194539249,\n \"acc_norm_stderr\": 0.012889272949313366\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25632344154550885,\n\
\ \"acc_stderr\": 0.004357101984278612,\n \"acc_norm\": 0.2568213503286198,\n\
\ \"acc_norm_stderr\": 0.00435987151963954\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03355677216313141,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03355677216313141\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.24150943396226415,\n \"acc_stderr\": 0.026341480371118376,\n\
\ \"acc_norm\": 0.24150943396226415,\n \"acc_norm_stderr\": 0.026341480371118376\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n\
\ \"acc_stderr\": 0.03063114553919882,\n \"acc_norm\": 0.2023121387283237,\n\
\ \"acc_norm_stderr\": 0.03063114553919882\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21052631578947367,\n\
\ \"acc_stderr\": 0.038351539543994194,\n \"acc_norm\": 0.21052631578947367,\n\
\ \"acc_norm_stderr\": 0.038351539543994194\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1774193548387097,\n\
\ \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.1774193548387097,\n\
\ \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.18719211822660098,\n \"acc_stderr\": 0.027444924966882618,\n\
\ \"acc_norm\": 0.18719211822660098,\n \"acc_norm_stderr\": 0.027444924966882618\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2222222222222222,\n \"acc_stderr\": 0.029620227874790482,\n \"\
acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.029620227874790482\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2,\n \"acc_stderr\": 0.020280805062535722,\n \"acc_norm\"\
: 0.2,\n \"acc_norm_stderr\": 0.020280805062535722\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\
: {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n\
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.026265024608275882,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.026265024608275882\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.19205298013245034,\n \"acc_stderr\": 0.032162984205936135,\n \"\
acc_norm\": 0.19205298013245034,\n \"acc_norm_stderr\": 0.032162984205936135\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1889908256880734,\n \"acc_stderr\": 0.016785481159203634,\n \"\
acc_norm\": 0.1889908256880734,\n \"acc_norm_stderr\": 0.016785481159203634\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.14814814814814814,\n \"acc_stderr\": 0.024227629273728356,\n \"\
acc_norm\": 0.14814814814814814,\n \"acc_norm_stderr\": 0.024227629273728356\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3004484304932735,\n\
\ \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.3004484304932735,\n\
\ \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22137404580152673,\n \"acc_stderr\": 0.0364129708131373,\n\
\ \"acc_norm\": 0.22137404580152673,\n \"acc_norm_stderr\": 0.0364129708131373\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24393358876117496,\n\
\ \"acc_stderr\": 0.015357212665829489,\n \"acc_norm\": 0.24393358876117496,\n\
\ \"acc_norm_stderr\": 0.015357212665829489\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2581699346405229,\n \"acc_stderr\": 0.025058503316958157,\n\
\ \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.025058503316958157\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.02841820861940679,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.02841820861940679\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.2710843373493976,\n \"acc_stderr\": 0.034605799075530276,\n\
\ \"acc_norm\": 0.2710843373493976,\n \"acc_norm_stderr\": 0.034605799075530276\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.32748538011695905,\n\
\ \"acc_stderr\": 0.035993357714560276,\n \"acc_norm\": 0.32748538011695905,\n\
\ \"acc_norm_stderr\": 0.035993357714560276\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.2460220318237454,\n \"mc1_stderr\": 0.015077219200662578,\n\
\ \"mc2\": 0.49848823283731625,\n \"mc2_stderr\": 0.016449164481650215\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.4940805051302289,\n\
\ \"acc_stderr\": 0.014051500838485807\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Dans-DiscountModels/TinyMistral-v2.5-MiniPile-Guidelines-E1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|arc:challenge|25_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|gsm8k|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hellaswag|10_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T21-44-48.551629.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T21-44-48.551629.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- '**/details_harness|winogrande|5_2024-02-01T21-44-48.551629.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-01T21-44-48.551629.parquet'
- config_name: results
data_files:
- split: 2024_02_01T21_44_48.551629
path:
- results_2024-02-01T21-44-48.551629.parquet
- split: latest
path:
- results_2024-02-01T21-44-48.551629.parquet
---
# Dataset Card for Evaluation run of Dans-DiscountModels/TinyMistral-v2.5-MiniPile-Guidelines-E1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Dans-DiscountModels/TinyMistral-v2.5-MiniPile-Guidelines-E1](https://huggingface.co/Dans-DiscountModels/TinyMistral-v2.5-MiniPile-Guidelines-E1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Dans-DiscountModels__TinyMistral-v2.5-MiniPile-Guidelines-E1",
"harness_winogrande_5",
split="train")
```
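Each of the 63 configurations can be loaded the same way by swapping the config name. To enumerate them first, a short sketch using the `datasets` helper `get_dataset_config_names` (assuming a recent version of the library):
```python
from datasets import get_dataset_config_names

# A sketch: list every configuration of this dataset -- one per evaluated
# task, plus the aggregated "results" configuration.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_Dans-DiscountModels__TinyMistral-v2.5-MiniPile-Guidelines-E1"
)
print(len(configs))
print(configs[:5])
```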
## Latest results
These are the [latest results from run 2024-02-01T21:44:48.551629](https://huggingface.co/datasets/open-llm-leaderboard/details_Dans-DiscountModels__TinyMistral-v2.5-MiniPile-Guidelines-E1/blob/main/results_2024-02-01T21-44-48.551629.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.235616909983485,
"acc_stderr": 0.030096196864815804,
"acc_norm": 0.23617812988980863,
"acc_norm_stderr": 0.030895352600212644,
"mc1": 0.2460220318237454,
"mc1_stderr": 0.015077219200662578,
"mc2": 0.49848823283731625,
"mc2_stderr": 0.016449164481650215
},
"harness|arc:challenge|25": {
"acc": 0.2090443686006826,
"acc_stderr": 0.011882746987406458,
"acc_norm": 0.2645051194539249,
"acc_norm_stderr": 0.012889272949313366
},
"harness|hellaswag|10": {
"acc": 0.25632344154550885,
"acc_stderr": 0.004357101984278612,
"acc_norm": 0.2568213503286198,
"acc_norm_stderr": 0.00435987151963954
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313141,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313141
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24150943396226415,
"acc_stderr": 0.026341480371118376,
"acc_norm": 0.24150943396226415,
"acc_norm_stderr": 0.026341480371118376
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2023121387283237,
"acc_stderr": 0.03063114553919882,
"acc_norm": 0.2023121387283237,
"acc_norm_stderr": 0.03063114553919882
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.038351539543994194,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.038351539543994194
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.18719211822660098,
"acc_stderr": 0.027444924966882618,
"acc_norm": 0.18719211822660098,
"acc_norm_stderr": 0.027444924966882618
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.029620227874790482,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.029620227874790482
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2,
"acc_stderr": 0.020280805062535722,
"acc_norm": 0.2,
"acc_norm_stderr": 0.020280805062535722
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.026265024608275882,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.026265024608275882
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.19205298013245034,
"acc_stderr": 0.032162984205936135,
"acc_norm": 0.19205298013245034,
"acc_norm_stderr": 0.032162984205936135
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1889908256880734,
"acc_stderr": 0.016785481159203634,
"acc_norm": 0.1889908256880734,
"acc_norm_stderr": 0.016785481159203634
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.14814814814814814,
"acc_stderr": 0.024227629273728356,
"acc_norm": 0.14814814814814814,
"acc_norm_stderr": 0.024227629273728356
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3004484304932735,
"acc_stderr": 0.030769352008229143,
"acc_norm": 0.3004484304932735,
"acc_norm_stderr": 0.030769352008229143
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22137404580152673,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.22137404580152673,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24393358876117496,
"acc_stderr": 0.015357212665829489,
"acc_norm": 0.24393358876117496,
"acc_norm_stderr": 0.015357212665829489
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.025058503316958157,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.025058503316958157
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.02841820861940679,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.02841820861940679
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.034605799075530276,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.034605799075530276
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.32748538011695905,
"acc_stderr": 0.035993357714560276,
"acc_norm": 0.32748538011695905,
"acc_norm_stderr": 0.035993357714560276
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2460220318237454,
"mc1_stderr": 0.015077219200662578,
"mc2": 0.49848823283731625,
"mc2_stderr": 0.016449164481650215
},
"harness|winogrande|5": {
"acc": 0.4940805051302289,
"acc_stderr": 0.014051500838485807
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
senhorsapo/yor | ---
license: openrail
---
|
ibm/Wish-QA-ASQA-Falcon | ---
dataset_info:
features:
- name: id
dtype: string
- name: old_question
dtype: string
- name: old_answer
dtype: string
- name: passage_1
dtype: string
- name: passage_2
dtype: string
- name: passage_3
dtype: string
- name: text
dtype: string
- name: qa
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: doc_score
dtype: float64
- name: score_qa
dtype: float64
- name: ans_num_words
dtype: int64
- name: text_num_words
dtype: int64
- name: text_longer_1.5
dtype: int64
- name: input
dtype: string
- name: output 0 answer
dtype: string
splits:
- name: train
num_bytes: 23433520
num_examples: 4354
download_size: 14082055
dataset_size: 23433520
---
# Dataset Card for "Wish-QA-ASQA-Falcon"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yekta/banchan | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 6899876.0
num_examples: 13
download_size: 6901557
dataset_size: 6899876.0
---
# Dataset Card for "banchan"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Deojoandco/dialogturns_not_generated_train | ---
dataset_info:
features:
- name: url
dtype: string
- name: id
dtype: string
- name: num_comments
dtype: int64
- name: name
dtype: string
- name: title
dtype: string
- name: body
dtype: string
- name: score
dtype: int64
- name: upvote_ratio
dtype: float64
- name: distinguished
dtype: string
- name: over_18
dtype: bool
- name: created_utc
dtype: int64
- name: comments
list:
- name: body
dtype: string
- name: created_utc
dtype: float64
- name: distinguished
dtype: string
- name: id
dtype: string
- name: permalink
dtype: string
- name: score
dtype: int64
- name: best_num_comments
dtype: int64
- name: query
dtype: string
- name: dialog
dtype: string
- name: annotation_success
dtype: bool
- name: annotation_text
dtype: string
- name: turns_generated
dtype: bool
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 7511834
num_examples: 284
download_size: 4057449
dataset_size: 7511834
---
# Dataset Card for "dialogturns_not_generated_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/Handwriting_OCR_Data_of_Japanese_and_Korean | ---
YAML tags:
- copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging
---
# Dataset Card for Nexdata/Handwriting_OCR_Data_of_Japanese_and_Korean
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/127?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
100 People - Handwriting OCR Data of Japanese and Korean. This dataset was collected from 100 subjects, including 50 Japanese, 49 Koreans, and 1 Afghan. The corpus differs across subjects. The data diversity covers multiple cellphone models and varied corpora. This dataset can be used for tasks such as handwriting OCR for Japanese and Korean.
For more details, please refer to the link: https://www.nexdata.ai/datasets/127?source=Huggingface
### Supported Tasks and Leaderboards
image-to-text, computer-vision: The dataset can be used to train a model for image-to-text.
### Languages
Japanese, Korean
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions
|
arieg/bw_spec_cls_80_11 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '27855'
'1': '27856'
'2': '27866'
'3': '27945'
'4': '27953'
'5': '27975'
'6': '27978'
'7': '27981'
'8': '27987'
'9': '28241'
'10': '28260'
'11': '28477'
'12': '28478'
'13': '28479'
'14': '28480'
'15': '28481'
'16': '28482'
'17': '28483'
'18': '28484'
'19': '28485'
'20': '28546'
'21': '28548'
'22': '28553'
'23': '28571'
'24': '28608'
'25': '29045'
'26': '29128'
'27': '29180'
'28': '29243'
'29': '29245'
'30': '29255'
'31': '29271'
'32': '29272'
'33': '29355'
'34': '29465'
'35': '29480'
'36': '29587'
'37': '29602'
'38': '29673'
'39': '29718'
'40': '29719'
'41': '29720'
'42': '29721'
'43': '29738'
'44': '29739'
'45': '29740'
'46': '29741'
'47': '29742'
'48': '29744'
'49': '29745'
'50': '29746'
'51': '29747'
'52': '29750'
'53': '29752'
'54': '29807'
'55': '29813'
'56': '29816'
'57': '29961'
'58': '29971'
'59': '30041'
'60': '30043'
'61': '30050'
'62': '30056'
'63': '30058'
'64': '30059'
'65': '30090'
'66': '30095'
'67': '30120'
'68': '30196'
'69': '30198'
'70': '30230'
'71': '30486'
'72': '30487'
'73': '30488'
'74': '30519'
'75': '30520'
'76': '30521'
'77': '30522'
'78': '30636'
'79': '30690'
splits:
- name: train
num_bytes: 89109867.2
num_examples: 1600
download_size: 88188426
dataset_size: 89109867.2
---
# Dataset Card for "bw_spec_cls_80_11"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-wmt14-de-en-fbedb0-67643145604 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- wmt14
eval_info:
task: translation
model: leukas/byt5-large-wmt14-deen
metrics: ['bleu']
dataset_name: wmt14
dataset_config: de-en
dataset_split: test
col_mapping:
source: translation.de
target: translation.en
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Translation
* Model: leukas/byt5-large-wmt14-deen
* Dataset: wmt14
* Config: de-en
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@seeed](https://huggingface.co/seeed) for evaluating this model. |
Carlisle/msmacro-passage-non-abs-small | ---
license: mit
---
|
vgaraujov/fapesp | ---
language:
- en
- es
- pt
license:
- cc-by-2.0
task_categories:
- translation
dataset_info:
- config_name: en-pt
features:
- name: translation
dtype:
translation:
languages:
- en
- pt
splits:
- name: train
num_bytes: 47417503
num_examples: 160975
- name: validation
num_bytes: 405055
num_examples: 1375
- name: test
num_bytes: 407579
num_examples: 1447
download_size: 29615550
dataset_size: 48230137
- config_name: es-pt
features:
- name: translation
dtype:
translation:
languages:
- es
- pt
splits:
- name: train
num_bytes: 47480897
num_examples: 158197
- name: validation
num_bytes: 377101
num_examples: 1302
- name: test
num_bytes: 400915
num_examples: 1379
download_size: 29829573
dataset_size: 48258913
configs:
- config_name: en-pt
data_files:
- split: train
path: en-pt/train-*
- split: validation
path: en-pt/validation-*
- split: test
path: en-pt/test-*
- config_name: es-pt
data_files:
- split: train
path: es-pt/train-*
- split: validation
path: es-pt/validation-*
- split: test
path: es-pt/test-*
--- |
ibranze/araproje_arc_tr_f2 | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
sequence:
- name: text
dtype: string
- name: label
dtype: string
- name: answerKey
dtype: string
splits:
- name: validation
num_bytes: 86423.0
num_examples: 250
download_size: 46973
dataset_size: 86423.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_arc_tr_f2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ywan111/macbook-dataset-b4 | ---
license: apache-2.0
---
|
MLNavigator/russian-retrieval | ---
license: mit
---
Based on Sberquad
- Answers are converted into human-readable form.
- Contexts are augmented with Wikipedia text passages that match the source text in topic and keywords.
- This dataset can be used to train retrieval LLMs, or adapters that give an LLM the ability to retrieve target information from a collection of thematically related texts.
- A variant of the dataset includes SOURCE data, so an answer can be generated together with the source document that supports it. See the file retrieval_dataset_src.jsonl.
The dataset consists of 45278 Russian-language examples of the format:
{
'text': 'text with the correct answer',
'q': 'question text',
'a': 'correct answer text',
'context': 'text of 4-10 text chunks, one containing the right answer and the others relevant to the text and question in topic and keywords'
}
The combined length of context + question + answer in one example is under 7000 characters, which should be under 2048 tokens of the rugpt tokenizer.
The file retrieval_dataset_src.jsonl additionally carries SOURCE data for every text chunk in the context; the SOURCE of the right answer is also set in the answer.
This variant of the dataset is useful if you need to extract an answer while specifying the source of the right answer.
{
'text': 'text with the correct answer',
'q': 'question text',
'a': 'correct answer text with SOURCE data of the text',
'context': 'text of 4-10 text chunks, one containing the right answer and the others relevant to the text and question in topic and keywords.
Each text chunk has its own SOURCE data'
}
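Since the variants ship as JSONL files, they can be loaded directly; a minimal sketch (assuming retrieval_dataset_src.jsonl sits at the repo root):
```python
from datasets import load_dataset

# Sketch only: the file location within the repo is an assumption.
ds = load_dataset(
    "json",
    data_files="https://huggingface.co/datasets/MLNavigator/russian-retrieval/resolve/main/retrieval_dataset_src.jsonl",
    split="train",
)
ex = ds[0]
print(ex["q"])  # question text
print(ex["a"])  # answer text, with SOURCE data in this variant
```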
All SOURCE data are synthetically generated, not real. |
taesiri/imagenet_hard_review_data | ---
license: mit
---
|
arthurmluz/GPTextSum_data-wiki_1024_results | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: summary
dtype: string
- name: gen_summary
dtype: string
- name: rouge
struct:
- name: rouge1
dtype: float64
- name: rouge2
dtype: float64
- name: rougeL
dtype: float64
- name: rougeLsum
dtype: float64
- name: bert
struct:
- name: f1
sequence: float64
- name: hashcode
dtype: string
- name: precision
sequence: float64
- name: recall
sequence: float64
- name: moverScore
dtype: float64
splits:
- name: validation
num_bytes: 25941
num_examples: 20
download_size: 32992
dataset_size: 25941
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "GPTextSum_data-wiki_1024_results"
rouge= {'rouge1': 0.20436494957206813, 'rouge2': 0.06669792477248418, 'rougeL': 0.1645584797463879, 'rougeLsum': 0.1645584797463879}
bert= {'precision': 0.7313757807016372, 'recall': 0.6589481264352799, 'f1': 0.6928485721349716} |
Jessiecs/llama-2-7b-a3-self-curated | ---
dataset_info:
features:
- name: instruction_generated
dtype: string
- name: response
dtype: string
- name: rating_score
dtype: string
- name: is_high_quality
dtype: bool
splits:
- name: train
num_bytes: 231166
num_examples: 128
download_size: 152410
dataset_size: 231166
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
gianma/eurlexsum_ita_cleaned_8192_232 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: is_camera
dtype: bool
- name: reference
dtype: string
- name: summary
dtype: string
- name: tokenized_len_total
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 4119487
num_examples: 228
- name: validation
num_bytes: 231666
num_examples: 13
- name: test
num_bytes: 253451
num_examples: 13
download_size: 0
dataset_size: 4604604
---
# Dataset Card for "eurlexsum_ita_cleaned_8192_232"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jonathanli/winston-ai-luka-dataset | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: model
dtype: string
splits:
- name: train
num_bytes: 29522548
num_examples: 10000
download_size: 16797973
dataset_size: 29522548
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sherelyn912/finnews_en_2wk_qa | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 310555.2191464821
num_examples: 1387
- name: test
num_bytes: 77694.78085351788
num_examples: 347
download_size: 182095
dataset_size: 388250.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
Content was extracted from various website articles using code from [FinNLP](https://github.com/AI4Finance-Foundation/FinNLP),
and questions and answers were generated with the OpenAI API model **gpt-3.5-turbo-16k**.
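A minimal sketch of the kind of generation call described above; the prompt wording and `article_text` are hypothetical, as the actual prompt is not documented here:
```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

article_text = "..."  # one extracted news article

# Hypothetical prompt; the prompt actually used for this dataset is not documented.
resp = client.chat.completions.create(
    model="gpt-3.5-turbo-16k",
    messages=[{
        "role": "user",
        "content": f"Write a question and its answer based on this article:\n\n{article_text}",
    }],
)
print(resp.choices[0].message.content)
``` |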
David-Egea/Creditcard-fraud-detection | ---
license: mit
---
# Credit Card Fraud Detection
This dataset was downloaded from https://www.kaggle.com/datasets/mlg-ulb/creditcardfraud/data and uploaded for educational purposes.
|
CronosGhost/wikipedia_fr_snippets | ---
license: mit
dataset_info:
features:
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 8126467952
num_examples: 12138717
download_size: 4271960527
dataset_size: 8126467952
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mwong/climate-evidence-related | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- en
license:
- cc-by-sa-3.0
- gpl-3.0
multilinguality:
- monolingual
paperswithcode_id: climate-fever
pretty_name: climate-fever
size_categories:
- 100K<n<1M
source_datasets:
- extended|climate_fever
task_categories:
- text-classification
task_ids:
- fact-checking
---
### Dataset Summary
This dataset is extracted from the Climate Fever dataset (https://www.sustainablefinance.uzh.ch/en/research/climate-fever.html), pre-processed and ready for training and evaluation.
The training objective is a text-classification task: given a claim and a piece of evidence, predict whether the evidence is related to the claim.
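A minimal sentence-pair fine-tuning sketch; the column names `claim`, `evidence`, and `label`, and the `train` split, are assumptions, since the card does not list the dataset's fields:
```python
from datasets import load_dataset
from transformers import AutoTokenizer

# Hypothetical column and split names -- not confirmed by this card.
ds = load_dataset("mwong/climate-evidence-related", split="train")
tok = AutoTokenizer.from_pretrained("bert-base-uncased")

def encode(batch):
    # Encode claim/evidence as a sentence pair for relatedness classification.
    return tok(batch["claim"], batch["evidence"], truncation=True)

encoded = ds.map(encode, batched=True)
``` |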
gagan3012/FSR | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: answer
dtype: string
- name: query
dtype: string
- name: choices
sequence: string
- name: gold
dtype: int64
splits:
- name: test
num_bytes: 5874352
num_examples: 3931
download_size: 1688819
dataset_size: 5874352
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
liuyanchen1015/MULTI_VALUE_mrpc_a_ing | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 194382
num_examples: 670
- name: train
num_bytes: 412800
num_examples: 1423
- name: validation
num_bytes: 45079
num_examples: 150
download_size: 437103
dataset_size: 652261
---
# Dataset Card for "MULTI_VALUE_mrpc_a_ing"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
automated-research-group/llama2_7b_chat-agieval-results | ---
dataset_info:
config_name: '{''do_sample''=False, ''beams''=1}'
features:
- name: id
dtype: string
- name: prediction
dtype: string
- name: agieval_accuracy
dtype: bool
splits:
- name: train
num_bytes: 85792
num_examples: 254
download_size: 46709
dataset_size: 85792
configs:
- config_name: '{''do_sample''=False, ''beams''=1}'
data_files:
- split: train
path: '{''do_sample''=False, ''beams''=1}/train-*'
---
|
coref-data/niv2_winogrande_raw | ---
license: apache-2.0
---
# Natural Instructions v2 Winogrande Tasks
- Project: https://github.com/allenai/natural-instructions
- Data source: [DataProvenanceInitiative/niv2_submix_original](https://huggingface.co/datasets/DataProvenanceInitiative/niv2_submix_original)
## Details
This dataset contains all Winogrande examples that were included in the [Flan 2022 collection](https://github.com/google-research/FLAN/tree/main/flan/v2) which were originally published in Super-Natural-Instructions.
The data is copied from the preprocessed Natural Instructions v2 dataset at [DataProvenanceInitiative/niv2_submix_original](https://huggingface.co/datasets/DataProvenanceInitiative/niv2_submix_original).
These tasks are:
1. 'task029_winogrande_full_object': Creating a pair of fill-in-the-blank question-answer pairs on objects.
2. 'task030_winogrande_full_person': Creating a pair of fill-in-the-blank questions on persons.
3. 'task031_winogrande_question_generation_object': Writing a fill-in-the-blank question on objects.
4. 'task032_winogrande_question_generation_person': Writing a fill-in-the-blank question on persons.
5. 'task033_winogrande_answer_generation': Answering a fill-in-the-blank question on objects.
6. 'task034_winogrande_question_modification_object': Modifying a fill-in-the-blank question on objects.
7. 'task035_winogrande_question_modification_person': Modifying a fill-in-the-blank question on persons.
8. 'task1391_winogrande_easy_answer_generation': Answering a fill-in-the-blank question on objects.
### Fields
- `inputs`: a `string` feature.
- `targets`: a `string` feature.
- `task_source`: a `string` feature.
- `task_name`: a `string` feature.
- `template_type`: a `string` feature.
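A minimal loading sketch for these fields (the `train` split name is an assumption, since the card does not list splits):
```python
from datasets import load_dataset

# Sketch only: split name is assumed.
ds = load_dataset("coref-data/niv2_winogrande_raw", split="train")
row = ds[0]
print(row["task_name"])   # e.g. "task033_winogrande_answer_generation"
print(row["inputs"])
print(row["targets"])
```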
## Citation
```
@inproceedings{wang-etal-2022-super,
title = "Super-{N}atural{I}nstructions: Generalization via Declarative Instructions on 1600+ {NLP} Tasks",
author = "Wang, Yizhong and
Mishra, Swaroop and
Alipoormolabashi, Pegah and
Kordi, Yeganeh and
Mirzaei, Amirreza and
Naik, Atharva and
Ashok, Arjun and
Dhanasekaran, Arut Selvan and
Arunkumar, Anjana and
Stap, David and
Pathak, Eshaan and
Karamanolakis, Giannis and
Lai, Haizhi and
Purohit, Ishan and
Mondal, Ishani and
Anderson, Jacob and
Kuznia, Kirby and
Doshi, Krima and
Pal, Kuntal Kumar and
Patel, Maitreya and
Moradshahi, Mehrad and
Parmar, Mihir and
Purohit, Mirali and
Varshney, Neeraj and
Kaza, Phani Rohitha and
Verma, Pulkit and
Puri, Ravsehaj Singh and
Karia, Rushang and
Doshi, Savan and
Sampat, Shailaja Keyur and
Mishra, Siddhartha and
Reddy A, Sujan and
Patro, Sumanta and
Dixit, Tanay and
Shen, Xudong",
editor = "Goldberg, Yoav and
Kozareva, Zornitsa and
Zhang, Yue",
booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
month = dec,
year = "2022",
address = "Abu Dhabi, United Arab Emirates",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2022.emnlp-main.340",
doi = "10.18653/v1/2022.emnlp-main.340",
pages = "5085--5109",
abstract = "How well can NLP models generalize to a variety of unseen tasks when provided with task instructions? To address this question, we first introduce Super-NaturalInstructions, a benchmark of 1,616 diverse NLP tasks and their expert-written instructions. Our collection covers 76 distinct task types, including but not limited to classification, extraction, infilling, sequence tagging, text rewriting, and text composition. This large and diverse collection of tasks enables rigorous benchmarking of cross-task generalization under instructions{---}training models to follow instructions on a subset of tasks and evaluating them on the remaining unseen ones. Furthermore, we build Tk-Instruct, a transformer model trained to follow a variety of in-context instructions (plain language task definitions or k-shot examples). Our experiments show that Tk-Instruct outperforms existing instruction-following models such as InstructGPT by over 9{\%} on our benchmark despite being an order of magnitude smaller. We further analyze generalization as a function of various scaling parameters, such as the number of observed tasks, the number of instances per task, and model sizes. We hope our dataset and model facilitate future progress towards more general-purpose NLP models.",
}
``` |
Zarakun/audiobooks_ua_test | ---
tags:
- "audio"
configs:
- config_name: default
data_files:
- split: train
path: "data/train.parquet"
---
### About the dataset
This is a dataset of Ukrainian audiobooks.
Each sample contains approximately 8 seconds of Ukrainian speech.
### Loading script
```python
>>> from datasets import load_dataset
>>> load_dataset("Zarakun/audiobooks_ua_test")
```
### Dataset structure
Every example has the following fields:
- **audio** - the waveform
- **rate** - the sampling rate of the waveform
- **file_id** - the id of the speaker
- **duration** - the duration of the audio in seconds
- **sentence** - the transcript of the audio
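These fields can be read like any other `datasets` columns; a minimal sketch:
```python
from datasets import load_dataset

ds = load_dataset("Zarakun/audiobooks_ua_test", split="train")
sample = ds[0]
print(sample["sentence"])                                # transcript
print(sample["duration"], "s at", sample["rate"], "Hz")  # length and sampling rate
```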
|
Shivanshyadav/wrinkled_to_ironed_clothes | ---
license: apache-2.0
dataset_info:
features:
- name: input_image
dtype: image
- name: instruct_prompt
dtype: string
- name: output_image
dtype: image
splits:
- name: train
num_bytes: 20133465629.31
num_examples: 3022
download_size: 19529267242
dataset_size: 20133465629.31
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
neerajaabhyankar/hindustani-raag-small | ---
license: cc-by-4.0
task_categories:
- audio-classification
tags:
- music
- hindustani
- raag
- raga
- raaga
pretty_name: Hindustani Raag Identification (Small)
size_categories:
- 1K<n<10K
configs:
- config_name: default
data_files:
- split: train
path: '**/train_*.mp3'
- split: test
path: '**/test_*.mp3'
--- |
open-llm-leaderboard/details_kyujinpy__PlatYi-34B-Llama-Q-FastChat | ---
pretty_name: Evaluation run of kyujinpy/PlatYi-34B-Llama-Q-FastChat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [kyujinpy/PlatYi-34B-Llama-Q-FastChat](https://huggingface.co/kyujinpy/PlatYi-34B-Llama-Q-FastChat)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kyujinpy__PlatYi-34B-Llama-Q-FastChat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-10T05:55:07.023442](https://huggingface.co/datasets/open-llm-leaderboard/details_kyujinpy__PlatYi-34B-Llama-Q-FastChat/blob/main/results_2023-12-10T05-55-07.023442.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7741514926490987,\n\
\ \"acc_stderr\": 0.027646135380835733,\n \"acc_norm\": 0.7828326159595959,\n\
\ \"acc_norm_stderr\": 0.02814394317924737,\n \"mc1\": 0.3880048959608323,\n\
\ \"mc1_stderr\": 0.017058761501347972,\n \"mc2\": 0.5362104216200869,\n\
\ \"mc2_stderr\": 0.01504184962981019\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6313993174061433,\n \"acc_stderr\": 0.014097810678042194,\n\
\ \"acc_norm\": 0.6612627986348123,\n \"acc_norm_stderr\": 0.013830568927974332\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6533559051981677,\n\
\ \"acc_stderr\": 0.004749286071559569,\n \"acc_norm\": 0.8525194184425413,\n\
\ \"acc_norm_stderr\": 0.003538596773704832\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-anatomy|5\"\
: {\n \"acc\": 0.7555555555555555,\n \"acc_stderr\": 0.03712537833614866,\n\
\ \"acc_norm\": 0.7555555555555555,\n \"acc_norm_stderr\": 0.03712537833614866\n\
\ },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.881578947368421,\n\
\ \"acc_stderr\": 0.026293995855474938,\n \"acc_norm\": 0.881578947368421,\n\
\ \"acc_norm_stderr\": 0.026293995855474938\n },\n \"harness|hendrycksTest-business_ethics|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\":\
\ 0.8075471698113208,\n \"acc_stderr\": 0.024262979839372277,\n \"\
acc_norm\": 0.8075471698113208,\n \"acc_norm_stderr\": 0.024262979839372277\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9027777777777778,\n\
\ \"acc_stderr\": 0.024774516250440182,\n \"acc_norm\": 0.9027777777777778,\n\
\ \"acc_norm_stderr\": 0.024774516250440182\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.67,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956913,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956913\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7283236994219653,\n\
\ \"acc_stderr\": 0.03391750322321659,\n \"acc_norm\": 0.7283236994219653,\n\
\ \"acc_norm_stderr\": 0.03391750322321659\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.04951218252396262,\n\
\ \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.04951218252396262\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n\
\ \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7872340425531915,\n \"acc_stderr\": 0.02675439134803976,\n\
\ \"acc_norm\": 0.7872340425531915,\n \"acc_norm_stderr\": 0.02675439134803976\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5789473684210527,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.5789473684210527,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7724137931034483,\n \"acc_stderr\": 0.03493950380131184,\n\
\ \"acc_norm\": 0.7724137931034483,\n \"acc_norm_stderr\": 0.03493950380131184\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.753968253968254,\n \"acc_stderr\": 0.022182037202948365,\n \"\
acc_norm\": 0.753968253968254,\n \"acc_norm_stderr\": 0.022182037202948365\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.6031746031746031,\n\
\ \"acc_stderr\": 0.043758884927270585,\n \"acc_norm\": 0.6031746031746031,\n\
\ \"acc_norm_stderr\": 0.043758884927270585\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9258064516129032,\n\
\ \"acc_stderr\": 0.01490952930054621,\n \"acc_norm\": 0.9258064516129032,\n\
\ \"acc_norm_stderr\": 0.01490952930054621\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6847290640394089,\n \"acc_stderr\": 0.03269080871970186,\n\
\ \"acc_norm\": 0.6847290640394089,\n \"acc_norm_stderr\": 0.03269080871970186\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\"\
: 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8666666666666667,\n \"acc_stderr\": 0.026544435312706463,\n\
\ \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.026544435312706463\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9242424242424242,\n \"acc_stderr\": 0.0188526702349931,\n \"acc_norm\"\
: 0.9242424242424242,\n \"acc_norm_stderr\": 0.0188526702349931\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527033,\n\
\ \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527033\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.823076923076923,\n \"acc_stderr\": 0.01934807017439698,\n \
\ \"acc_norm\": 0.823076923076923,\n \"acc_norm_stderr\": 0.01934807017439698\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.4925925925925926,\n \"acc_stderr\": 0.0304821923951915,\n \
\ \"acc_norm\": 0.4925925925925926,\n \"acc_norm_stderr\": 0.0304821923951915\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8697478991596639,\n \"acc_stderr\": 0.02186325849485212,\n \
\ \"acc_norm\": 0.8697478991596639,\n \"acc_norm_stderr\": 0.02186325849485212\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5496688741721855,\n \"acc_stderr\": 0.04062290018683775,\n \"\
acc_norm\": 0.5496688741721855,\n \"acc_norm_stderr\": 0.04062290018683775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9302752293577982,\n \"acc_stderr\": 0.010919426411848607,\n \"\
acc_norm\": 0.9302752293577982,\n \"acc_norm_stderr\": 0.010919426411848607\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.7222222222222222,\n \"acc_stderr\": 0.0305467452649532,\n \"acc_norm\"\
: 0.7222222222222222,\n \"acc_norm_stderr\": 0.0305467452649532\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9215686274509803,\n\
\ \"acc_stderr\": 0.018869514646658935,\n \"acc_norm\": 0.9215686274509803,\n\
\ \"acc_norm_stderr\": 0.018869514646658935\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.919831223628692,\n \"acc_stderr\": 0.017676679991891632,\n\
\ \"acc_norm\": 0.919831223628692,\n \"acc_norm_stderr\": 0.017676679991891632\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8026905829596412,\n\
\ \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.8026905829596412,\n\
\ \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8549618320610687,\n \"acc_stderr\": 0.03088466108951538,\n\
\ \"acc_norm\": 0.8549618320610687,\n \"acc_norm_stderr\": 0.03088466108951538\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.9338842975206612,\n \"acc_stderr\": 0.022683403691723312,\n \"\
acc_norm\": 0.9338842975206612,\n \"acc_norm_stderr\": 0.022683403691723312\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.03038159675665167,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.03038159675665167\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8773006134969326,\n \"acc_stderr\": 0.025777328426978927,\n\
\ \"acc_norm\": 0.8773006134969326,\n \"acc_norm_stderr\": 0.025777328426978927\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6339285714285714,\n\
\ \"acc_stderr\": 0.04572372358737431,\n \"acc_norm\": 0.6339285714285714,\n\
\ \"acc_norm_stderr\": 0.04572372358737431\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.883495145631068,\n \"acc_stderr\": 0.031766839486404054,\n\
\ \"acc_norm\": 0.883495145631068,\n \"acc_norm_stderr\": 0.031766839486404054\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9487179487179487,\n\
\ \"acc_stderr\": 0.014450181176872736,\n \"acc_norm\": 0.9487179487179487,\n\
\ \"acc_norm_stderr\": 0.014450181176872736\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.03015113445777634,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.03015113445777634\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9080459770114943,\n\
\ \"acc_stderr\": 0.010333225570778518,\n \"acc_norm\": 0.9080459770114943,\n\
\ \"acc_norm_stderr\": 0.010333225570778518\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8352601156069365,\n \"acc_stderr\": 0.019971040982442265,\n\
\ \"acc_norm\": 0.8352601156069365,\n \"acc_norm_stderr\": 0.019971040982442265\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.788826815642458,\n\
\ \"acc_stderr\": 0.013650276794312199,\n \"acc_norm\": 0.788826815642458,\n\
\ \"acc_norm_stderr\": 0.013650276794312199\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8660130718954249,\n \"acc_stderr\": 0.019504890618464815,\n\
\ \"acc_norm\": 0.8660130718954249,\n \"acc_norm_stderr\": 0.019504890618464815\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8456591639871383,\n\
\ \"acc_stderr\": 0.020519050342084726,\n \"acc_norm\": 0.8456591639871383,\n\
\ \"acc_norm_stderr\": 0.020519050342084726\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8580246913580247,\n \"acc_stderr\": 0.019420260109438293,\n\
\ \"acc_norm\": 0.8580246913580247,\n \"acc_norm_stderr\": 0.019420260109438293\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6631205673758865,\n \"acc_stderr\": 0.02819553487396673,\n \
\ \"acc_norm\": 0.6631205673758865,\n \"acc_norm_stderr\": 0.02819553487396673\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6186440677966102,\n\
\ \"acc_stderr\": 0.01240550940188812,\n \"acc_norm\": 0.6186440677966102,\n\
\ \"acc_norm_stderr\": 0.01240550940188812\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8272058823529411,\n \"acc_stderr\": 0.022966067585581767,\n\
\ \"acc_norm\": 0.8272058823529411,\n \"acc_norm_stderr\": 0.022966067585581767\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8316993464052288,\n \"acc_stderr\": 0.01513580333869338,\n \
\ \"acc_norm\": 0.8316993464052288,\n \"acc_norm_stderr\": 0.01513580333869338\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8367346938775511,\n \"acc_stderr\": 0.023661699177098615,\n\
\ \"acc_norm\": 0.8367346938775511,\n \"acc_norm_stderr\": 0.023661699177098615\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n\
\ \"acc_stderr\": 0.02372983088101853,\n \"acc_norm\": 0.8706467661691543,\n\
\ \"acc_norm_stderr\": 0.02372983088101853\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.93,\n \"acc_stderr\": 0.025643239997624294,\n \
\ \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.025643239997624294\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n\
\ \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n\
\ \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015578,\n\
\ \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015578\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3880048959608323,\n\
\ \"mc1_stderr\": 0.017058761501347972,\n \"mc2\": 0.5362104216200869,\n\
\ \"mc2_stderr\": 0.01504184962981019\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8216258879242304,\n \"acc_stderr\": 0.010759352014855944\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.44351781652767247,\n \
\ \"acc_stderr\": 0.013684327592606165\n }\n}\n```"
repo_url: https://huggingface.co/kyujinpy/PlatYi-34B-Llama-Q-FastChat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|arc:challenge|25_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|gsm8k|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hellaswag|10_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T05-55-07.023442.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-10T05-55-07.023442.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- '**/details_harness|winogrande|5_2023-12-10T05-55-07.023442.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-10T05-55-07.023442.parquet'
- config_name: results
data_files:
- split: 2023_12_10T05_55_07.023442
path:
- results_2023-12-10T05-55-07.023442.parquet
- split: latest
path:
- results_2023-12-10T05-55-07.023442.parquet
---
# Dataset Card for Evaluation run of kyujinpy/PlatYi-34B-Llama-Q-FastChat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/kyujinpy/PlatYi-34B-Llama-Q-FastChat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [kyujinpy/PlatYi-34B-Llama-Q-FastChat](https://huggingface.co/kyujinpy/PlatYi-34B-Llama-Q-FastChat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kyujinpy__PlatYi-34B-Llama-Q-FastChat",
"harness_winogrande_5",
split="train")
```
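Each run is also exposed as its own timestamped split, with a `latest` split aliasing the most recent run (see the configs above). As a minimal sketch, you can load either split explicitly:
```python
from datasets import load_dataset

# load the run-specific split, named after the run timestamp
data_run = load_dataset("open-llm-leaderboard/details_kyujinpy__PlatYi-34B-Llama-Q-FastChat",
	"harness_winogrande_5",
	split="2023_12_10T05_55_07.023442")

# or the "latest" split, which always points to the most recent run
data_latest = load_dataset("open-llm-leaderboard/details_kyujinpy__PlatYi-34B-Llama-Q-FastChat",
	"harness_winogrande_5",
	split="latest")
```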
## Latest results
These are the [latest results from run 2023-12-10T05:55:07.023442](https://huggingface.co/datasets/open-llm-leaderboard/details_kyujinpy__PlatYi-34B-Llama-Q-FastChat/blob/main/results_2023-12-10T05-55-07.023442.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7741514926490987,
"acc_stderr": 0.027646135380835733,
"acc_norm": 0.7828326159595959,
"acc_norm_stderr": 0.02814394317924737,
"mc1": 0.3880048959608323,
"mc1_stderr": 0.017058761501347972,
"mc2": 0.5362104216200869,
"mc2_stderr": 0.01504184962981019
},
"harness|arc:challenge|25": {
"acc": 0.6313993174061433,
"acc_stderr": 0.014097810678042194,
"acc_norm": 0.6612627986348123,
"acc_norm_stderr": 0.013830568927974332
},
"harness|hellaswag|10": {
"acc": 0.6533559051981677,
"acc_stderr": 0.004749286071559569,
"acc_norm": 0.8525194184425413,
"acc_norm_stderr": 0.003538596773704832
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7555555555555555,
"acc_stderr": 0.03712537833614866,
"acc_norm": 0.7555555555555555,
"acc_norm_stderr": 0.03712537833614866
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.881578947368421,
"acc_stderr": 0.026293995855474938,
"acc_norm": 0.881578947368421,
"acc_norm_stderr": 0.026293995855474938
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8075471698113208,
"acc_stderr": 0.024262979839372277,
"acc_norm": 0.8075471698113208,
"acc_norm_stderr": 0.024262979839372277
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9027777777777778,
"acc_stderr": 0.024774516250440182,
"acc_norm": 0.9027777777777778,
"acc_norm_stderr": 0.024774516250440182
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.03391750322321659,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.03391750322321659
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.04951218252396262,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.04951218252396262
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7872340425531915,
"acc_stderr": 0.02675439134803976,
"acc_norm": 0.7872340425531915,
"acc_norm_stderr": 0.02675439134803976
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7724137931034483,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.7724137931034483,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.753968253968254,
"acc_stderr": 0.022182037202948365,
"acc_norm": 0.753968253968254,
"acc_norm_stderr": 0.022182037202948365
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.6031746031746031,
"acc_stderr": 0.043758884927270585,
"acc_norm": 0.6031746031746031,
"acc_norm_stderr": 0.043758884927270585
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9258064516129032,
"acc_stderr": 0.01490952930054621,
"acc_norm": 0.9258064516129032,
"acc_norm_stderr": 0.01490952930054621
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6847290640394089,
"acc_stderr": 0.03269080871970186,
"acc_norm": 0.6847290640394089,
"acc_norm_stderr": 0.03269080871970186
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8666666666666667,
"acc_stderr": 0.026544435312706463,
"acc_norm": 0.8666666666666667,
"acc_norm_stderr": 0.026544435312706463
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9242424242424242,
"acc_stderr": 0.0188526702349931,
"acc_norm": 0.9242424242424242,
"acc_norm_stderr": 0.0188526702349931
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9689119170984456,
"acc_stderr": 0.012525310625527033,
"acc_norm": 0.9689119170984456,
"acc_norm_stderr": 0.012525310625527033
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.823076923076923,
"acc_stderr": 0.01934807017439698,
"acc_norm": 0.823076923076923,
"acc_norm_stderr": 0.01934807017439698
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4925925925925926,
"acc_stderr": 0.0304821923951915,
"acc_norm": 0.4925925925925926,
"acc_norm_stderr": 0.0304821923951915
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8697478991596639,
"acc_stderr": 0.02186325849485212,
"acc_norm": 0.8697478991596639,
"acc_norm_stderr": 0.02186325849485212
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5496688741721855,
"acc_stderr": 0.04062290018683775,
"acc_norm": 0.5496688741721855,
"acc_norm_stderr": 0.04062290018683775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9302752293577982,
"acc_stderr": 0.010919426411848607,
"acc_norm": 0.9302752293577982,
"acc_norm_stderr": 0.010919426411848607
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.0305467452649532,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.0305467452649532
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.018869514646658935,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.018869514646658935
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.919831223628692,
"acc_stderr": 0.017676679991891632,
"acc_norm": 0.919831223628692,
"acc_norm_stderr": 0.017676679991891632
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8026905829596412,
"acc_stderr": 0.02670985334496796,
"acc_norm": 0.8026905829596412,
"acc_norm_stderr": 0.02670985334496796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8549618320610687,
"acc_stderr": 0.03088466108951538,
"acc_norm": 0.8549618320610687,
"acc_norm_stderr": 0.03088466108951538
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9338842975206612,
"acc_stderr": 0.022683403691723312,
"acc_norm": 0.9338842975206612,
"acc_norm_stderr": 0.022683403691723312
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.03038159675665167,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.03038159675665167
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8773006134969326,
"acc_stderr": 0.025777328426978927,
"acc_norm": 0.8773006134969326,
"acc_norm_stderr": 0.025777328426978927
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6339285714285714,
"acc_stderr": 0.04572372358737431,
"acc_norm": 0.6339285714285714,
"acc_norm_stderr": 0.04572372358737431
},
"harness|hendrycksTest-management|5": {
"acc": 0.883495145631068,
"acc_stderr": 0.031766839486404054,
"acc_norm": 0.883495145631068,
"acc_norm_stderr": 0.031766839486404054
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9487179487179487,
"acc_stderr": 0.014450181176872736,
"acc_norm": 0.9487179487179487,
"acc_norm_stderr": 0.014450181176872736
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.9,
"acc_stderr": 0.03015113445777634,
"acc_norm": 0.9,
"acc_norm_stderr": 0.03015113445777634
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9080459770114943,
"acc_stderr": 0.010333225570778518,
"acc_norm": 0.9080459770114943,
"acc_norm_stderr": 0.010333225570778518
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8352601156069365,
"acc_stderr": 0.019971040982442265,
"acc_norm": 0.8352601156069365,
"acc_norm_stderr": 0.019971040982442265
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.788826815642458,
"acc_stderr": 0.013650276794312199,
"acc_norm": 0.788826815642458,
"acc_norm_stderr": 0.013650276794312199
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8660130718954249,
"acc_stderr": 0.019504890618464815,
"acc_norm": 0.8660130718954249,
"acc_norm_stderr": 0.019504890618464815
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8456591639871383,
"acc_stderr": 0.020519050342084726,
"acc_norm": 0.8456591639871383,
"acc_norm_stderr": 0.020519050342084726
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8580246913580247,
"acc_stderr": 0.019420260109438293,
"acc_norm": 0.8580246913580247,
"acc_norm_stderr": 0.019420260109438293
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6631205673758865,
"acc_stderr": 0.02819553487396673,
"acc_norm": 0.6631205673758865,
"acc_norm_stderr": 0.02819553487396673
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6186440677966102,
"acc_stderr": 0.01240550940188812,
"acc_norm": 0.6186440677966102,
"acc_norm_stderr": 0.01240550940188812
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8272058823529411,
"acc_stderr": 0.022966067585581767,
"acc_norm": 0.8272058823529411,
"acc_norm_stderr": 0.022966067585581767
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8316993464052288,
"acc_stderr": 0.01513580333869338,
"acc_norm": 0.8316993464052288,
"acc_norm_stderr": 0.01513580333869338
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.04309118709946458,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.04309118709946458
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8367346938775511,
"acc_stderr": 0.023661699177098615,
"acc_norm": 0.8367346938775511,
"acc_norm_stderr": 0.023661699177098615
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.02372983088101853,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.02372983088101853
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.025643239997624294,
"acc_norm": 0.93,
"acc_norm_stderr": 0.025643239997624294
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015578,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015578
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3880048959608323,
"mc1_stderr": 0.017058761501347972,
"mc2": 0.5362104216200869,
"mc2_stderr": 0.01504184962981019
},
"harness|winogrande|5": {
"acc": 0.8216258879242304,
"acc_stderr": 0.010759352014855944
},
"harness|gsm8k|5": {
"acc": 0.44351781652767247,
"acc_stderr": 0.013684327592606165
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
matlok/python-copilot-training-on-ai-research-repos | ---
license:
- other
pretty_name: >-
python copilot ai research coding dataset
dataset_info:
- config_name: view_schema
splits:
- name: view_schema
configs:
- config_name: view_schema
data_files:
- split: view_schema
path: files/lok-python-code-ai-core-v1_00000002.parquet
size_categories:
- 100K<n<1M
tags:
- python-copilot
- python-coding
- fine-tuning
- training
- alpaca
- text
- coding
task_categories:
- text-generation
task_ids:
- parsing
---
## Python Copilot AI Research Coding Dataset
This dataset is a subset of the matlok python copilot datasets. Please refer to the [Multimodal Python Copilot Training Overview](https://huggingface.co/datasets/matlok/multimodal-python-copilot-training-overview) for more details on how to use this dataset.
### Details
Each row contains python code, either a class method or a global function, imported modules, base classes (if any), exceptions (ordered based on the code), returns (ordered based on the code), arguments (ordered based on the code), and more.
- Rows: 514430
- Size: 674 MB
- Data type: text
- Format: Extracted code using python AST
### Schema
```json
{
"args": "string",
"class_bases": "string",
"class_docstr": "string",
"class_docstr_tok": "string",
"class_name": "string",
"code": "string",
"code_tok": "string",
"docstr": "string",
"docstr_tok": "string",
"file_path": "string",
"filename": "string",
"imports": "string",
"is_member": "bool",
"label_desc": "string",
"label_desc_len": "int64",
"label_id": "string",
"lend": "int64",
"lstart": "int64",
"name": "string",
"num_all_bases": "float64",
"num_bases": "float64",
"num_classes": "float64",
"num_functions": "int64",
"num_imports": "int64",
"num_methods": "float64",
"raises": "string",
"returns": "string",
"total_objects": "int64"
}
```
### How to use the dataset
```python
from datasets import load_dataset
ds = load_dataset("matlok/python-copilot-training-on-ai-research-repos", data_dir="files")
```
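As a quick, hedged sketch (assuming the parquet files under `files/` load as shown above), you can inspect a record against the schema:
```python
from datasets import load_dataset

ds = load_dataset("matlok/python-copilot-training-on-ai-research-repos", data_dir="files")

# peek at the first record of each split, using fields from the schema above
for split_name, split in ds.items():
    row = split[0]
    print(split_name, row["name"], row["file_path"], row["num_imports"])
```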
|
Falah/real_military_machinery_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 39077069
num_examples: 100000
download_size: 4335632
dataset_size: 39077069
---
# Dataset Card for "real_military_machinery_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
umm-maybe/ai_images | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': train_dataset
- name: text
dtype: string
splits:
- name: train
num_bytes: 540439882.0
num_examples: 304
download_size: 540208895
dataset_size: 540439882.0
---
# Dataset Card for "ai_images"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/uomi_chihiro_seitokaiyakuindomo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Uomi Chihiro (Seitokai Yakuindomo)
This is the dataset of Uomi Chihiro (Seitokai Yakuindomo), containing 208 images and their tags.
The core tags of this character are `black_hair, twintails, long_hair, low_twintails, black_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 208 | 93.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/uomi_chihiro_seitokaiyakuindomo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 208 | 80.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/uomi_chihiro_seitokaiyakuindomo/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 421 | 158.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/uomi_chihiro_seitokaiyakuindomo/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 208 | 93.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/uomi_chihiro_seitokaiyakuindomo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 421 | 177.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/uomi_chihiro_seitokaiyakuindomo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
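The IMG+TXT packages can be fetched the same way as the raw archive; a minimal sketch for the `dataset-800.zip` package from the table above:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download one of the IMG+TXT packages listed in the table
zip_file = hf_hub_download(
    repo_id='CyberHarem/uomi_chihiro_seitokaiyakuindomo',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract images and their .txt tag files to a local directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
```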
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/uomi_chihiro_seitokaiyakuindomo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 15 |  |  |  |  |  | 1girl, school_uniform, solo, upper_body, cardigan, purple_eyes, white_shirt, smile, closed_mouth, red_necktie, collared_shirt, jacket, bangs |
| 1 | 21 |  |  |  |  |  | 1girl, necktie, school_uniform, solo, cardigan, skirt |
| 2 | 5 |  |  |  |  |  | 1girl, profile, school_uniform, solo, collared_shirt, from_side, blazer, closed_mouth, outdoors, red_necktie, upper_body, white_shirt, bangs, blue_eyes, blurry_background, depth_of_field |
| 3 | 13 |  |  |  |  |  | 1girl, solo, twin_braids, anime_coloring, hair_over_shoulder, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | school_uniform | solo | upper_body | cardigan | purple_eyes | white_shirt | smile | closed_mouth | red_necktie | collared_shirt | jacket | bangs | necktie | skirt | profile | from_side | blazer | outdoors | blue_eyes | blurry_background | depth_of_field | twin_braids | anime_coloring | hair_over_shoulder |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-------|:-------------|:-----------|:--------------|:--------------|:--------|:---------------|:--------------|:-----------------|:---------|:--------|:----------|:--------|:----------|:------------|:---------|:-----------|:------------|:--------------------|:-----------------|:--------------|:-----------------|:---------------------|
| 0 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 1 | 21 |  |  |  |  |  | X | X | X | | X | | | | | | | | | X | X | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | | | X | | X | X | X | | X | | | X | X | X | X | X | X | X | | | |
| 3 | 13 |  |  |  |  |  | X | | X | X | | | | | | | | | | | | | | | | | | | X | X | X |
|
Beluuuuuuga/Japanese-Instruction-Linux-Command-169 | ---
license: cc-by-nc-4.0
task_categories:
- question-answering
language:
- ja
size_categories:
- n<1K
--- |
argilla/news-fakenews | ---
language:
- en
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-classification
dataset_info:
features:
- name: text
dtype: string
- name: inputs
struct:
- name: text
dtype: string
- name: prediction
list:
- name: label
dtype: string
- name: score
dtype: float64
- name: prediction_agent
dtype: string
- name: annotation
dtype: 'null'
- name: annotation_agent
dtype: 'null'
- name: multi_label
dtype: bool
- name: explanation
dtype: 'null'
- name: id
dtype: string
- name: metadata
dtype: 'null'
- name: status
dtype: string
- name: event_timestamp
dtype: timestamp[us]
- name: metrics
struct:
- name: text_length
dtype: int64
splits:
- name: train
num_bytes: 227222498
num_examples: 44898
download_size: 138350597
dataset_size: 227222498
---
# Dataset Card for "news-fakenews"
## Dataset Description
- **Homepage:** Kaggle Challenge
- **Repository:** https://www.kaggle.com/datasets/clmentbisaillon/fake-and-real-news-dataset?select=True.csv
- **Paper:** N.A.
- **Leaderboard:** N.A.
- **Point of Contact:** N.A.
### Dataset Summary
Can you use this dataset to build an algorithm able to determine whether an article is fake news or not?
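The records follow the structure declared in the YAML metadata above (a `text` field plus a list of model `prediction` entries). A minimal sketch for loading and inspecting them:
```python
from datasets import load_dataset

ds = load_dataset("argilla/news-fakenews", split="train")

record = ds[0]
print(record["text"][:200])   # the raw article text
print(record["prediction"])   # list of {"label": ..., "score": ...} predictions
```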
### Languages
English
### Citation Information
Acknowledgements
Ahmed H, Traore I, Saad S. “Detecting opinion spams and fake news using text classification”, Journal of Security and Privacy, Volume 1, Issue 1, Wiley, January/February 2018.
Ahmed H, Traore I, Saad S. (2017) “Detection of Online Fake News Using N-Gram Analysis and Machine Learning Techniques. In: Traore I., Woungang I., Awad A. (eds) Intelligent, Secure, and Dependable Systems in Distributed and Cloud Environments. ISDDC 2017. Lecture Notes in Computer Science, vol 10618. Springer, Cham (pp. 127-138).
### Contributions
Thanks to [@davidberenstein1957](https://github.com/davidberenstein1957) for adding this dataset. |
roman_urdu_hate_speech | ---
annotations_creators:
- expert-generated
language_creators:
- crowdsourced
language:
- ur
license:
- mit
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- multi-class-classification
pretty_name: roman_urdu_hate_speech
tags:
- binary classification
dataset_info:
- config_name: Coarse_Grained
features:
- name: tweet
dtype: string
- name: label
dtype:
class_label:
names:
'0': Abusive/Offensive
'1': Normal
splits:
- name: train
num_bytes: 725719
num_examples: 7208
- name: test
num_bytes: 218087
num_examples: 2002
- name: validation
num_bytes: 79759
num_examples: 800
download_size: 927937
dataset_size: 1023565
- config_name: Fine_Grained
features:
- name: tweet
dtype: string
- name: label
dtype:
class_label:
names:
'0': Abusive/Offensive
'1': Normal
'2': Religious Hate
'3': Sexism
'4': Profane/Untargeted
splits:
- name: train
num_bytes: 723670
num_examples: 7208
- name: test
num_bytes: 219359
num_examples: 2002
- name: validation
num_bytes: 723670
num_examples: 7208
download_size: 1519423
dataset_size: 1666699
---
# Dataset Card for roman_urdu_hate_speech
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-instances)
- [Data Splits](#data-instances)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [roman_urdu_hate_speech homepage](https://aclanthology.org/2020.emnlp-main.197/)
- **Repository:** [roman_urdu_hate_speech repository](https://github.com/haroonshakeel/roman_urdu_hate_speech)
- **Paper:** [Hate-Speech and Offensive Language Detection in Roman Urdu](https://aclanthology.org/2020.emnlp-main.197.pdf)
- **Leaderboard:** [N/A]
- **Point of Contact:** [M. Haroon Shakeel](mailto:m.shakeel@lums.edu.pk)
### Dataset Summary
The Roman Urdu Hate-Speech and Offensive Language Detection (RUHSOLD) dataset is a Roman Urdu dataset of tweets annotated by experts in the relevant language. The authors develop gold standards for two sub-tasks. The first sub-task is based on binary labels of Hate-Offensive content and Normal content (i.e., inoffensive language). These labels are self-explanatory. The authors refer to this sub-task as coarse-grained classification. The second sub-task defines Hate-Offensive content with four labels at a granular level. These labels are the most relevant for the demographic of users who converse in RU and are defined in the related literature. The authors refer to this sub-task as fine-grained classification. The objective behind creating two gold standards is to enable researchers to evaluate hate speech detection approaches on both easier (coarse-grained) and more challenging (fine-grained) scenarios.
### Supported Tasks and Leaderboards
- 'multi-class-classification', 'text-classification-other-binary classification': The dataset can be used both for multi-class classification and for binary classification, as it contains both coarse-grained and fine-grained labels.
### Languages
The text of this dataset is Roman Urdu. The associated BCP-47 code is 'ur'.
## Dataset Structure
### Data Instances
The dataset consists of two parts, a coarse-grained segment and a fine-grained segment. The difference is that in the coarse-grained segment the tweets are labelled as abusive or normal, whereas in the fine-grained segment there are several classes of hate associated with a tweet.
For the coarse-grained segment of the dataset, the label mapping is:
Task 1: Coarse-grained Classification Labels
- 0: Abusive/Offensive
- 1: Normal

For the fine-grained segment of the dataset, the label mapping is:
Task 2: Fine-grained Classification Labels
- 0: Abusive/Offensive
- 1: Normal
- 2: Religious Hate
- 3: Sexism
- 4: Profane/Untargeted
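As a minimal sketch, the label names for either segment can be read straight from the dataset features:
```python
from datasets import load_dataset

# load the fine-grained configuration (use "Coarse_Grained" for task 1);
# on newer `datasets` versions, script-based loading may need trust_remote_code=True
ds = load_dataset("roman_urdu_hate_speech", "Fine_Grained", split="train")

# the ClassLabel feature exposes the mapping shown above
print(ds.features["label"].names)
```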
An example from Roman Urdu Hate Speech looks as follows:
```
{
  'tweet': 'there are some yahodi daboo like imran chore zakat khore',
  'label': 0
}
```
### Data Fields
- `tweet`: a string denoting the tweet; 10000 tweets were randomly sampled from a base of 50000 tweets and annotated for the dataset.
- `label`: an annotation manually assigned by three independent annotators; during the annotation process, all conflicts were resolved by a majority vote among the three annotators.
### Data Splits
The data of each of the segments, Coarse Grained and Fine Grained, is further split into training, validation, and test sets. The split uses a 70/20/10 ratio with stratification based on the fine-grained labels.
The use of stratified sampling is deemed necessary to preserve the same label ratio across all splits.
The final split sizes are as follows:

| Train | Valid | Test |
|------:|------:|-----:|
| 7209  | 2003  | 801  |
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
The dataset was created by Hammad Rizwan, Muhammad Haroon Shakeel, Asim Karim during work done at Department of Computer Science, Lahore University of Management Sciences (LUMS), Lahore, Pakistan.
### Licensing Information
The licensing status of the dataset hinges on the legal status of the [Roman Urdu Hate Speech Dataset Repository](https://github.com/haroonshakeel/roman_urdu_hate_speech) which is under MIT License.
### Citation Information
```bibtex
@inproceedings{rizwan2020hate,
title={Hate-speech and offensive language detection in roman Urdu},
author={Rizwan, Hammad and Shakeel, Muhammad Haroon and Karim, Asim},
booktitle={Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)},
pages={2512--2522},
year={2020}
}
```
### Contributions
Thanks to [@bp-high](https://github.com/bp-high), for adding this dataset. |
emaeon/train7 | ---
dataset_info:
features:
- name: code1
dtype: string
- name: code2
dtype: string
- name: similar
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 9013855023
num_examples: 5000000
download_size: 4017642295
dataset_size: 9013855023
---
# Dataset Card for "train7"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/British_English_Average_Tone_Speech_Synthesis_Corpus | ---
task_categories:
- text-to-speech
language:
- en
---
# Dataset Card for Nexdata/British_English_Average_Tone_Speech_Synthesis_Corpus
## Description
10 People - British English Average Tone Speech Synthesis Corpus. It is recorded by British English native speakers with an authentic accent. The phoneme coverage is balanced, and professional phoneticians participated in the annotation. It precisely matches the research and development needs of speech synthesis.
For more details, please refer to the link: https://www.nexdata.ai/datasets/1309?source=Huggingface
# Specifications
## Format
48,000 Hz, 24-bit, uncompressed WAV, mono channel.
## Recording environment
Professional recording studio.
## Recording content
General narrative sentences, interrogative sentences, etc.
## Speaker
British English native speakers, 5 male and 5 female, 2 hours per person.
## Device
Microphone.
## Language
British English.
## Annotation
Word and phoneme transcription, four-level prosodic boundary annotation.
## Application scenarios
Speech synthesis.
# Licensing Information
Commercial License |
irds/wikir_en59k | ---
pretty_name: '`wikir/en59k`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `wikir/en59k`
The `wikir/en59k` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/wikir#wikir/en59k).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=2,454,785
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/wikir_en59k', 'docs')
for record in docs:
record # {'doc_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@inproceedings{Frej2020Wikir,
title={WIKIR: A Python toolkit for building a large-scale Wikipedia-based English Information Retrieval Dataset},
author={Jibril Frej and Didier Schwab and Jean-Pierre Chevallet},
booktitle={LREC},
year={2020}
}
@inproceedings{Frej2020MlWikir,
title={MLWIKIR: A Python Toolkit for Building Large-scale Wikipedia-based Information Retrieval Datasets in Chinese, English, French, Italian, Japanese, Spanish and More},
author={Jibril Frej and Didier Schwab and Jean-Pierre Chevallet},
booktitle={CIRCLE},
year={2020}
}
```
|
aarnow/auditory-skills-test | ---
language:
- en
license: mit
dataset_info:
features:
- name: label
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 16536
num_examples: 178
download_size: 7410
dataset_size: 16536
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
The purpose of this project is to build a dataset and model to enable an AI-powered diagnostic tool that assesses a child's auditory skills and recommends resources and therapies that can bring them to the next stage. The primary user base of this tool is intended to be the parents of a child with hearing loss; however, it is the hope of the creators of this tool that speech and language pathologists (SLPs) and other early intervention and pediatric practitioners will also find it useful.
The tool uses a natural language processing (NLP) model for text classification and converts free text inputted by the parent of a child with hearing loss into 1 of 4 clinical categories: DETECTION, DISCRIMINATION, IDENTIFICATION, CLASSIFICATION.
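As an illustration only (this card does not publish the model), here is a hedged sketch of this kind of four-way text classification using a zero-shot pipeline; the model name and the example sentence are assumptions, not the project's actual implementation:
```python
# hedged sketch: map a parent's free-text observation to one of the four
# clinical categories; the zero-shot model below is an assumption, not the
# model actually used by this project
from transformers import pipeline

classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

labels = ["DETECTION", "DISCRIMINATION", "IDENTIFICATION", "CLASSIFICATION"]
text = "My child turns toward me when I call her name but cannot tell words apart yet."

result = classifier(text, candidate_labels=labels)
print(result["labels"][0], result["scores"][0])  # top category and its score
```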
Based on the classification of the child against a given skill, a recommendation is made for therapies that can be used to improve the child's competency in that skill. The value of this approach is that each child is challenged to build upon existing skills while not being given any task so difficult that it results in discouragement. |
Ga88/Clovis8 | ---
license: openrail
---
|
reglab/land-app-trial | ---
license: cc-by-4.0
task_categories:
- object-detection
language:
- en
tags:
- agriculture
- environment
size_categories:
- 1K<n<10K
---
# Land application field trial data
### Intro
This dataset is a repository of results from our Land Application Detection Model trial with two organizations.
Land application is the process of disposing of agricultural animal waste by spraying it onto fields. [We developed a model](https://github.com/reglab/land-application-detection?tab=readme-ov-file) to detect these practices.
This dataset represents the results of a real-world trial to verify and label these detected spreads.
### Data description
#### Structured data
- sent_to_wdnr.csv
- Each row is a detected spread that we forwarded to our partners at WDNR
- sent_to_elpc.csv
- Each row is a detected spread that we forwarded to our partners at ELPC
- wdnr_responses.csv
- Each row is a response to a detection from sent_to_wdnr.csv. It contains a preliminary determination by WDNR staff as to whether the image looks like a spread and, if it was determined to be likely spreading, the results of an investigation into that spread.
- elpc_responses_raw.csv
- Each row is a response to a detection from sent_to_elpc.csv, containing the results of the ELPC investigation into that detection, carried out by citizen volunteers verifying in person.
- elpc_responses_clean.csv
- Same as the raw file but with corrected detection ids to deal with a data entry error.
#### Image data
- images/
- This directory contains .jpeg images of satellite data fed into the model that were sent to either of the partners. Images were captured by [Planet](https://www.planet.com/) using the PlanetScope sensor (visual spectrum, 3 m resolution).
## Citation
`@misc {stanford_regulation,_evaluation,_and_governance_lab_2024,
author = { {Stanford Regulation, Evaluation, and Governance Lab} },
title = { land-app-trial (Revision b3d0e11) },
year = 2024,
url = { https://huggingface.co/datasets/reglab/land-app-trial },
doi = { 10.57967/hf/1733 },
publisher = { Hugging Face }
}`
|
zolak/twitter_dataset_79_1713209820 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 1387710
num_examples: 3432
download_size: 680273
dataset_size: 1387710
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
distilled-one-sec-cv12-each-chunk-uniq/chunk_16 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1243273188.0
num_examples: 242259
download_size: 1270215878
dataset_size: 1243273188.0
---
# Dataset Card for "chunk_16"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-from-one-sec-cv12/chunk_140 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1147381768
num_examples: 223574
download_size: 1170376996
dataset_size: 1147381768
---
# Dataset Card for "chunk_140"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
roydcarlson/dirt_teff2 | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 6436424.0
num_examples: 7
download_size: 6352411
dataset_size: 6436424.0
---
# Dataset Card for "dirt_teff2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tollefj/big-bang-theory-splits-removal | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: en
dtype: string
- name: 'no'
dtype: string
splits:
- name: train
num_bytes: 8082
num_examples: 60
- name: test
num_bytes: 3499
num_examples: 27
download_size: 12705
dataset_size: 11581
---
# Dataset Card for "big-bang-theory-splits-removal"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/mr-tydi_th_test | ---
pretty_name: '`mr-tydi/th/test`'
viewer: false
source_datasets: ['irds/mr-tydi_th']
task_categories:
- text-retrieval
---
# Dataset Card for `mr-tydi/th/test`
The `mr-tydi/th/test` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/mr-tydi#mr-tydi/th/test).
# Data
This dataset provides:
- `queries` (i.e., topics); count=1,190
- `qrels`: (relevance assessments); count=1,368
- For `docs`, use [`irds/mr-tydi_th`](https://huggingface.co/datasets/irds/mr-tydi_th)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/mr-tydi_th_test', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/mr-tydi_th_test', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Zhang2021MrTyDi,
title={{Mr. TyDi}: A Multi-lingual Benchmark for Dense Retrieval},
author={Xinyu Zhang and Xueguang Ma and Peng Shi and Jimmy Lin},
year={2021},
journal={arXiv:2108.08787},
}
@article{Clark2020TyDiQa,
title={{TyDi QA}: A Benchmark for Information-Seeking Question Answering in Typologically Diverse Languages},
author={Jonathan H. Clark and Eunsol Choi and Michael Collins and Dan Garrette and Tom Kwiatkowski and Vitaly Nikolaev and Jennimaria Palomaki},
year={2020},
journal={Transactions of the Association for Computational Linguistics}
}
```
|
LambdaTests/VQAv2_sample_validation_benchmarks_partition_global_7_loca_7 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: response
dtype: string
splits:
- name: train
num_bytes: 13
num_examples: 1
download_size: 0
dataset_size: 13
---
# Dataset Card for "VQAv2_sample_validation_benchmarks_partition_global_7_loca_7"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/1cc7040b | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 178
num_examples: 10
download_size: 1340
dataset_size: 178
---
# Dataset Card for "1cc7040b"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SahilSN/DataSet_v3 | ---
license: unknown
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 11291
num_examples: 50
download_size: 6939
dataset_size: 11291
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
banloada/ban | ---
license: other
---
|
OzoneAsai/factorExpander | ---
license: wtfpl
task_categories:
- conversational
language:
- ja
---
# Polynomial Expansion and Factoring Dataset
This dataset contains problem and solution pairs for polynomial expansion and factoring. Each problem is the result of expanding and factoring an expression of the form `(x + n)^2`, where `n` takes values from -1000 to 1000, using the `sympy` library.
## Dataset Structure
- `factorized_dataset.csv`: CSV file containing the dataset.
- `README.md`: This file, which provides an overview and usage instructions for the dataset.
## Data Format
The CSV file of the dataset includes the following columns:
- `Instruction`: A description of the problem. It indicates the expression that should be expanded or factored.
- `Output`: The answer expression.
## Sample Data
| Instruction | Output |
|-----------------------------------|--------------------|
| 式を展開せよ: (x - 1000)**2 | x**2 - 2000*x + 1000000 |
...
## Dataset Information
- Number of samples: 2002 (due to the range of n from -1000 to 1000)
- Dataset format: CSV file
|
ingTikna/Prolog_Dataset | ---
license: mit
---
|
SF-Corpus/EF_Supersense_Tags | ---
language:
- en
pretty_name: sf-nexus-ef-supersense-tags
---
# Dataset Card for SF Nexus Extracted Features: Supersense Tags
## Dataset Description
- **Homepage:** https://sfnexus.io/
- **Repository:** https://github.com/SF-Nexus/extracted-features-notebooks
- **Point of Contact:** Alex Wermer-Colan
### Dataset Summary
The SF Nexus EF Supersense Tags dataset contains supersense tags generated from 403 mid-twentieth century science fiction books, originally digitized from Temple University Libraries' Paskow Science Fiction Collection.
After digitization, the books were cleaned using Abbyy FineReader.
The dataframes in this repository were generated using BookNLP and contain information about the "supersense tags" in the texts.
### About the SF Nexus Corpus
The Paskow Science Fiction collection contains primarily materials from post-WWII, especially mass-market works of the New Wave era (often dated to 1964-1980).
The digitized texts have also been ingested into HathiTrust's repository for preservation and data curation; they are now viewable on HathiTrust's [Temple page](https://babel.hathitrust.org/cgi/ls?field1=ocr;q1=%2A;a=srchls;facet=htsource%3A%22Temple%20University%22;pn=4) for non-consumptive research.
For more information on the project to digitize and curate a corpus of "New Wave" science fiction, see Alex Wermer-Colan's post on the Temple University Scholars Studio blog, ["Building a New Wave Science Fiction Corpus."](https://sites.temple.edu/tudsc/2017/12/20/building-new-wave-science-fiction-corpus/).
### Languages
English
## Dataset Structure
This dataset contains 403 csv files containing information about the supersense tags in each text in the SF corpus. For example:
```
First line of dataframe: 1908_HODGSON_THEHOUSEONTHEBORDERLAND.txt.supersense.csv
{'start_token': 4,
 'end_token': 4,
 'supersense_category': 'noun.location',
 'text': 'Borderland'}
```
### Data Fields
- **start_token: int** The start token of the entity name
- **end_token: int** The end token of the entity name; the same as the start token for one-word entities, increasing by one for each additional word that is part of the token
- **supersense_category: str** The part of speech and category to which the text belongs
- **text: str** The text corresponding to the supersense tag
### Loading the Dataset
Use the following code to load the dataset in a Python environment (note: this does not work while the repo is set to private)
```
from datasets import load_dataset
# If the dataset is gated/private, make sure you have run huggingface-cli login
dataset = load_dataset("SF-Corpus/EF_Supersense_Tags")
```
Or just clone the dataset repo
```
git lfs install
git clone https://huggingface.co/datasets/SF-Corpus/EF_Supersense_Tags
# if you want to clone without large files – just their pointers
# prepend your git clone with the following env var:
GIT_LFS_SKIP_SMUDGE=1
```
## Dataset Creation
### Curation Rationale
For an overview of our approach to data curation of literary texts, see Alex Wermer-Colan’s and James Kopaczewski’s article, “The New Wave of Digital Collections: Speculating on the Future of Library Curation” (2022).
### Source Data
The Loretta C. Duckworth Scholars Studio has partnered with Temple University Libraries’ Special Collections Research Center (SCRC) and Digital Library Initiatives (DLI) to build a digitized corpus of copyrighted science fiction literature. Besides its voluminous Urban Archives, the SCRC also houses a significant collection of science-fiction literature. The Paskow Science Fiction Collection was originally established in 1972, when Temple acquired 5,000 science fiction paperbacks from a Temple alumnus, the late David C. Paskow. Subsequent donations, including troves of fanzines and the papers of such sci-fi writers as John Varley and Stanley G. Weinbaum, expanded the collection over the last few decades, both in size and in the range of genres. SCRC staff and undergraduate student workers recently performed the usual comparison of gift titles against cataloged books, removing science fiction items that were exact duplicates of existing holdings. A refocusing of the SCRC’s collection development policy for science fiction de-emphasized fantasy and horror titles, so some titles in those genres were removed as well.
## Considerations for Using the Data
This data card only exhibits extracted features for copyrighted fiction; no copyrighted work is being made available for consumption. These digitized files are made accessible for purposes of education and research. Temple University Libraries have given attribution to rights holders when possible. If you hold the rights to materials in our digitized collections that are unattributed, please let us know so that we may maintain accurate information about these materials.
If you are a rights holder and are concerned that you have found material on this website for which you have not granted permission (or is not covered by a copyright exception under US copyright laws), you may request the removal of the material from our site by writing to digitalscholarship@temple.edu.
For more information on non-consumptive research, check out HathiTrust Research Center’s Non-Consumptive Use Research Policy.
## Additional Information
### Dataset Curators
For a full list of contributors to the SF Nexus project, visit [https://sfnexus.io/people/](https://sfnexus.io/people/). |
carnival13/massive_eng_DA_tokenized | ---
dataset_info:
features:
- name: pass_label
dtype: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 97244320
num_examples: 138200
download_size: 22020759
dataset_size: 97244320
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "massive_eng_DA_tokenized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/OxfordFlowers_test_google_flan_t5_xl_mode_A_ns_6149 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0__Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_clip_tags_LAION_ViT_H_14_2B_simple_specific_rices
num_bytes: 2470439
num_examples: 6149
download_size: 269782
dataset_size: 2470439
---
# Dataset Card for "OxfordFlowers_test_google_flan_t5_xl_mode_A_ns_6149"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Vikhrmodels/habr_qa_sbs | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: question
dtype: string
- name: best
dtype: string
- name: bad
dtype: string
splits:
- name: train
num_bytes: 119263751
num_examples: 102558
download_size: 66726288
dataset_size: 119263751
license: apache-2.0
task_categories:
- question-answering
- text-generation
language:
- ru
tags:
- code
- finance
pretty_name: habr_qa_sbs
size_categories:
- 10K<n<100K
---
# Habr sbs qa
The dataset is based on the Habr Q&A site: the best answer is the one that has likes, and the worst is the one with the fewest likes.
The dataset was collected by [Love.Death.Transformers.](https://t.me/lovedeathtransformers) and [Дата-Утренник](https://t.me/data_morning)
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_wnli_indef_one | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 5997
num_examples: 29
- name: test
num_bytes: 27482
num_examples: 94
- name: train
num_bytes: 40847
num_examples: 195
download_size: 32547
dataset_size: 74326
---
# Dataset Card for "MULTI_VALUE_wnli_indef_one"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.3B-Cinder-Reason-Test-2 | ---
pretty_name: Evaluation run of Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test-2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test-2](https://huggingface.co/Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test-2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.3B-Cinder-Reason-Test-2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-04T20:26:53.463273](https://huggingface.co/datasets/open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.3B-Cinder-Reason-Test-2/blob/main/results_2024-02-04T20-26-53.463273.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2605954222902013,\n\
\ \"acc_stderr\": 0.030887287206153434,\n \"acc_norm\": 0.2609822344299048,\n\
\ \"acc_norm_stderr\": 0.031636108991043924,\n \"mc1\": 0.22643818849449204,\n\
\ \"mc1_stderr\": 0.014651337324602574,\n \"mc2\": 0.372644846918848,\n\
\ \"mc2_stderr\": 0.014009270688888235\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.29436860068259385,\n \"acc_stderr\": 0.013318528460539422,\n\
\ \"acc_norm\": 0.32764505119453924,\n \"acc_norm_stderr\": 0.013715847940719346\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4347739494124676,\n\
\ \"acc_stderr\": 0.004947141797384123,\n \"acc_norm\": 0.5791674965146385,\n\
\ \"acc_norm_stderr\": 0.004926837572202166\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.24444444444444444,\n\
\ \"acc_stderr\": 0.03712537833614866,\n \"acc_norm\": 0.24444444444444444,\n\
\ \"acc_norm_stderr\": 0.03712537833614866\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.03317672787533157,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.03317672787533157\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.33,\n\
\ \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2679245283018868,\n \"acc_stderr\": 0.027257260322494845,\n\
\ \"acc_norm\": 0.2679245283018868,\n \"acc_norm_stderr\": 0.027257260322494845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2152777777777778,\n\
\ \"acc_stderr\": 0.03437079344106134,\n \"acc_norm\": 0.2152777777777778,\n\
\ \"acc_norm_stderr\": 0.03437079344106134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\"\
: 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483099,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483099\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3191489361702128,\n \"acc_stderr\": 0.030472973363380045,\n\
\ \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.030472973363380045\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.04096985139843672,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.04096985139843672\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.03695183311650232,\n\
\ \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.03695183311650232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n\
\ \"acc_stderr\": 0.036196045241242515,\n \"acc_norm\": 0.20634920634920634,\n\
\ \"acc_norm_stderr\": 0.036196045241242515\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653695,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653695\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24838709677419354,\n\
\ \"acc_stderr\": 0.024580028921481003,\n \"acc_norm\": 0.24838709677419354,\n\
\ \"acc_norm_stderr\": 0.024580028921481003\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2660098522167488,\n \"acc_stderr\": 0.03108982600293753,\n\
\ \"acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.03108982600293753\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.21717171717171718,\n \"acc_stderr\": 0.029376616484945637,\n \"\
acc_norm\": 0.21717171717171718,\n \"acc_norm_stderr\": 0.029376616484945637\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.24870466321243523,\n \"acc_stderr\": 0.03119584087770031,\n\
\ \"acc_norm\": 0.24870466321243523,\n \"acc_norm_stderr\": 0.03119584087770031\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2846153846153846,\n \"acc_stderr\": 0.022878322799706287,\n\
\ \"acc_norm\": 0.2846153846153846,\n \"acc_norm_stderr\": 0.022878322799706287\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275805,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275805\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24789915966386555,\n \"acc_stderr\": 0.028047967224176896,\n\
\ \"acc_norm\": 0.24789915966386555,\n \"acc_norm_stderr\": 0.028047967224176896\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2052980132450331,\n \"acc_stderr\": 0.03297986648473834,\n \"\
acc_norm\": 0.2052980132450331,\n \"acc_norm_stderr\": 0.03297986648473834\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23302752293577983,\n \"acc_stderr\": 0.0181256691808615,\n \"\
acc_norm\": 0.23302752293577983,\n \"acc_norm_stderr\": 0.0181256691808615\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3425925925925926,\n \"acc_stderr\": 0.03236585252602158,\n \"\
acc_norm\": 0.3425925925925926,\n \"acc_norm_stderr\": 0.03236585252602158\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.2911392405063291,\n \"acc_stderr\": 0.029571601065753374,\n\
\ \"acc_norm\": 0.2911392405063291,\n \"acc_norm_stderr\": 0.029571601065753374\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.37668161434977576,\n\
\ \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.37668161434977576,\n\
\ \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.21374045801526717,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.21374045801526717,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.23140495867768596,\n \"acc_stderr\": 0.03849856098794088,\n \"\
acc_norm\": 0.23140495867768596,\n \"acc_norm_stderr\": 0.03849856098794088\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n\
\ \"acc_stderr\": 0.04077494709252628,\n \"acc_norm\": 0.23148148148148148,\n\
\ \"acc_norm_stderr\": 0.04077494709252628\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3312883435582822,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.3312883435582822,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.22321428571428573,\n\
\ \"acc_stderr\": 0.039523019677025116,\n \"acc_norm\": 0.22321428571428573,\n\
\ \"acc_norm_stderr\": 0.039523019677025116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690877,\n\
\ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690877\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24358974358974358,\n\
\ \"acc_stderr\": 0.02812096650391441,\n \"acc_norm\": 0.24358974358974358,\n\
\ \"acc_norm_stderr\": 0.02812096650391441\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2835249042145594,\n\
\ \"acc_stderr\": 0.016117318166832283,\n \"acc_norm\": 0.2835249042145594,\n\
\ \"acc_norm_stderr\": 0.016117318166832283\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.02317629820399201,\n\
\ \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.02317629820399201\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24183006535947713,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2861736334405145,\n\
\ \"acc_stderr\": 0.025670259242188947,\n \"acc_norm\": 0.2861736334405145,\n\
\ \"acc_norm_stderr\": 0.025670259242188947\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23049645390070922,\n \"acc_stderr\": 0.025123739226872395,\n \
\ \"acc_norm\": 0.23049645390070922,\n \"acc_norm_stderr\": 0.025123739226872395\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2438070404172099,\n\
\ \"acc_stderr\": 0.010966507972178475,\n \"acc_norm\": 0.2438070404172099,\n\
\ \"acc_norm_stderr\": 0.010966507972178475\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.025767252010855963,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.025767252010855963\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.32727272727272727,\n \"acc_stderr\": 0.044942908662520896,\n\
\ \"acc_norm\": 0.32727272727272727,\n \"acc_norm_stderr\": 0.044942908662520896\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.17959183673469387,\n\
\ \"acc_stderr\": 0.024573293589585637,\n \"acc_norm\": 0.17959183673469387,\n\
\ \"acc_norm_stderr\": 0.024573293589585637\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.21393034825870647,\n \"acc_stderr\": 0.028996909693328927,\n\
\ \"acc_norm\": 0.21393034825870647,\n \"acc_norm_stderr\": 0.028996909693328927\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n\
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.3253012048192771,\n \"acc_stderr\": 0.03647168523683227,\n\
\ \"acc_norm\": 0.3253012048192771,\n \"acc_norm_stderr\": 0.03647168523683227\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.21052631578947367,\n\
\ \"acc_stderr\": 0.031267817146631786,\n \"acc_norm\": 0.21052631578947367,\n\
\ \"acc_norm_stderr\": 0.031267817146631786\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.22643818849449204,\n \"mc1_stderr\": 0.014651337324602574,\n\
\ \"mc2\": 0.372644846918848,\n \"mc2_stderr\": 0.014009270688888235\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.6479873717442778,\n\
\ \"acc_stderr\": 0.013422874824929714\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.028051554207733132,\n \"acc_stderr\": 0.004548229533836337\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test-2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|arc:challenge|25_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|arc:challenge|25_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|arc:challenge|25_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|gsm8k|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|gsm8k|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|gsm8k|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hellaswag|10_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hellaswag|10_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hellaswag|10_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T18-17-11.697806.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T18-52-11.664162.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T20-26-53.463273.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-04T20-26-53.463273.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- '**/details_harness|winogrande|5_2024-02-04T18-17-11.697806.parquet'
- split: 2024_02_04T18_52_11.664162
path:
- '**/details_harness|winogrande|5_2024-02-04T18-52-11.664162.parquet'
- split: 2024_02_04T20_26_53.463273
path:
- '**/details_harness|winogrande|5_2024-02-04T20-26-53.463273.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-04T20-26-53.463273.parquet'
- config_name: results
data_files:
- split: 2024_02_04T18_17_11.697806
path:
- results_2024-02-04T18-17-11.697806.parquet
- split: 2024_02_04T18_52_11.664162
path:
- results_2024-02-04T18-52-11.664162.parquet
- split: 2024_02_04T20_26_53.463273
path:
- results_2024-02-04T20-26-53.463273.parquet
- split: latest
path:
- results_2024-02-04T20-26-53.463273.parquet
---
# Dataset Card for Evaluation run of Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test-2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test-2](https://huggingface.co/Josephgflowers/Tinyllama-1.3B-Cinder-Reason-Test-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.3B-Cinder-Reason-Test-2",
"harness_winogrande_5",
split="train")
```
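The timestamped splits listed in the configuration above can also be requested directly, and the aggregated metrics live in the "results" configuration. A minimal sketch (the config and split names are taken verbatim from the `data_files` listing above):
```python
from datasets import load_dataset

# Aggregated metrics for the most recent run.
results = load_dataset("open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.3B-Cinder-Reason-Test-2",
	"results",
	split="latest")

# Pin a specific evaluation run by its timestamped split name.
run = load_dataset("open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.3B-Cinder-Reason-Test-2",
	"harness_winogrande_5",
	split="2024_02_04T18_17_11.697806")
```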
## Latest results
These are the [latest results from run 2024-02-04T20:26:53.463273](https://huggingface.co/datasets/open-llm-leaderboard/details_Josephgflowers__Tinyllama-1.3B-Cinder-Reason-Test-2/blob/main/results_2024-02-04T20-26-53.463273.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2605954222902013,
"acc_stderr": 0.030887287206153434,
"acc_norm": 0.2609822344299048,
"acc_norm_stderr": 0.031636108991043924,
"mc1": 0.22643818849449204,
"mc1_stderr": 0.014651337324602574,
"mc2": 0.372644846918848,
"mc2_stderr": 0.014009270688888235
},
"harness|arc:challenge|25": {
"acc": 0.29436860068259385,
"acc_stderr": 0.013318528460539422,
"acc_norm": 0.32764505119453924,
"acc_norm_stderr": 0.013715847940719346
},
"harness|hellaswag|10": {
"acc": 0.4347739494124676,
"acc_stderr": 0.004947141797384123,
"acc_norm": 0.5791674965146385,
"acc_norm_stderr": 0.004926837572202166
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.03712537833614866,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.03712537833614866
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.03317672787533157,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.03317672787533157
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2679245283018868,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.2679245283018868,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2152777777777778,
"acc_stderr": 0.03437079344106134,
"acc_norm": 0.2152777777777778,
"acc_norm_stderr": 0.03437079344106134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483099,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483099
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3191489361702128,
"acc_stderr": 0.030472973363380045,
"acc_norm": 0.3191489361702128,
"acc_norm_stderr": 0.030472973363380045
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.04096985139843672,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.04096985139843672
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.03695183311650232,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.03695183311650232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.036196045241242515,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.036196045241242515
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24838709677419354,
"acc_stderr": 0.024580028921481003,
"acc_norm": 0.24838709677419354,
"acc_norm_stderr": 0.024580028921481003
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2660098522167488,
"acc_stderr": 0.03108982600293753,
"acc_norm": 0.2660098522167488,
"acc_norm_stderr": 0.03108982600293753
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21717171717171718,
"acc_stderr": 0.029376616484945637,
"acc_norm": 0.21717171717171718,
"acc_norm_stderr": 0.029376616484945637
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.24870466321243523,
"acc_stderr": 0.03119584087770031,
"acc_norm": 0.24870466321243523,
"acc_norm_stderr": 0.03119584087770031
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2846153846153846,
"acc_stderr": 0.022878322799706287,
"acc_norm": 0.2846153846153846,
"acc_norm_stderr": 0.022878322799706287
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275805,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275805
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24789915966386555,
"acc_stderr": 0.028047967224176896,
"acc_norm": 0.24789915966386555,
"acc_norm_stderr": 0.028047967224176896
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2052980132450331,
"acc_stderr": 0.03297986648473834,
"acc_norm": 0.2052980132450331,
"acc_norm_stderr": 0.03297986648473834
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23302752293577983,
"acc_stderr": 0.0181256691808615,
"acc_norm": 0.23302752293577983,
"acc_norm_stderr": 0.0181256691808615
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3425925925925926,
"acc_stderr": 0.03236585252602158,
"acc_norm": 0.3425925925925926,
"acc_norm_stderr": 0.03236585252602158
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2911392405063291,
"acc_stderr": 0.029571601065753374,
"acc_norm": 0.2911392405063291,
"acc_norm_stderr": 0.029571601065753374
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.37668161434977576,
"acc_stderr": 0.032521134899291884,
"acc_norm": 0.37668161434977576,
"acc_norm_stderr": 0.032521134899291884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.21374045801526717,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.21374045801526717,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.23140495867768596,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.23140495867768596,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252628,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3312883435582822,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.3312883435582822,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.22321428571428573,
"acc_stderr": 0.039523019677025116,
"acc_norm": 0.22321428571428573,
"acc_norm_stderr": 0.039523019677025116
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690877,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690877
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.24358974358974358,
"acc_stderr": 0.02812096650391441,
"acc_norm": 0.24358974358974358,
"acc_norm_stderr": 0.02812096650391441
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2835249042145594,
"acc_stderr": 0.016117318166832283,
"acc_norm": 0.2835249042145594,
"acc_norm_stderr": 0.016117318166832283
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24566473988439305,
"acc_stderr": 0.02317629820399201,
"acc_norm": 0.24566473988439305,
"acc_norm_stderr": 0.02317629820399201
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24183006535947713,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.24183006535947713,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2861736334405145,
"acc_stderr": 0.025670259242188947,
"acc_norm": 0.2861736334405145,
"acc_norm_stderr": 0.025670259242188947
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23049645390070922,
"acc_stderr": 0.025123739226872395,
"acc_norm": 0.23049645390070922,
"acc_norm_stderr": 0.025123739226872395
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2438070404172099,
"acc_stderr": 0.010966507972178475,
"acc_norm": 0.2438070404172099,
"acc_norm_stderr": 0.010966507972178475
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.025767252010855963,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.025767252010855963
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.32727272727272727,
"acc_stderr": 0.044942908662520896,
"acc_norm": 0.32727272727272727,
"acc_norm_stderr": 0.044942908662520896
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17959183673469387,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.17959183673469387,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.21393034825870647,
"acc_stderr": 0.028996909693328927,
"acc_norm": 0.21393034825870647,
"acc_norm_stderr": 0.028996909693328927
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3253012048192771,
"acc_stderr": 0.03647168523683227,
"acc_norm": 0.3253012048192771,
"acc_norm_stderr": 0.03647168523683227
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.031267817146631786,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.031267817146631786
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22643818849449204,
"mc1_stderr": 0.014651337324602574,
"mc2": 0.372644846918848,
"mc2_stderr": 0.014009270688888235
},
"harness|winogrande|5": {
"acc": 0.6479873717442778,
"acc_stderr": 0.013422874824929714
},
"harness|gsm8k|5": {
"acc": 0.028051554207733132,
"acc_stderr": 0.004548229533836337
}
}
```
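To read these aggregate numbers programmatically, the JSON file linked above can also be fetched directly, using the `/resolve/` form of the URL (which serves the raw file rather than the web view). A minimal sketch; note that, depending on the file layout, the metrics dictionary shown above may be the top level of the JSON or sit under a "results" key, so the code checks both:
```python
import json
import urllib.request

url = ("https://huggingface.co/datasets/open-llm-leaderboard/"
       "details_Josephgflowers__Tinyllama-1.3B-Cinder-Reason-Test-2/"
       "resolve/main/results_2024-02-04T20-26-53.463273.json")

with urllib.request.urlopen(url) as f:
    data = json.load(f)

# The metrics dict may be nested under a "results" key.
metrics = data.get("results", data)
print(metrics["all"]["acc"])                   # aggregated accuracy
print(metrics["harness|winogrande|5"]["acc"])  # a single task
```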
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
Samburskoy/TT3

---
license: openrail
---
|