---
pretty_name: Evaluation run of Josephgflowers/TinyLlama-Cinder-1.3B-Test.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Josephgflowers/TinyLlama-Cinder-1.3B-Test.2](https://huggingface.co/Josephgflowers/TinyLlama-Cinder-1.3B-Test.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Josephgflowers__TinyLlama-Cinder-1.3B-Test.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-27T17:11:20.414206](https://huggingface.co/datasets/open-llm-leaderboard/details_Josephgflowers__TinyLlama-Cinder-1.3B-Test.2/blob/main/results_2024-01-27T17-11-20.414206.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks; you can find each of them in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2632532516237253,\n\
\ \"acc_stderr\": 0.03096899556861394,\n \"acc_norm\": 0.26383605058795256,\n\
\ \"acc_norm_stderr\": 0.03172701859516164,\n \"mc1\": 0.21909424724602203,\n\
\ \"mc1_stderr\": 0.014480038578757438,\n \"mc2\": 0.3798174614120003,\n\
\ \"mc2_stderr\": 0.01429160027055937\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.31143344709897613,\n \"acc_stderr\": 0.013532472099850945,\n\
\ \"acc_norm\": 0.3370307167235495,\n \"acc_norm_stderr\": 0.01381347665290227\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4422425811591316,\n\
\ \"acc_stderr\": 0.004956378590571539,\n \"acc_norm\": 0.5866361282613025,\n\
\ \"acc_norm_stderr\": 0.004914305798575694\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.035914440841969694,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.035914440841969694\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.35,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.27547169811320754,\n \"acc_stderr\": 0.027495663683724057,\n\
\ \"acc_norm\": 0.27547169811320754,\n \"acc_norm_stderr\": 0.027495663683724057\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2138728323699422,\n\
\ \"acc_stderr\": 0.03126511206173042,\n \"acc_norm\": 0.2138728323699422,\n\
\ \"acc_norm_stderr\": 0.03126511206173042\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n\
\ \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.32340425531914896,\n \"acc_stderr\": 0.030579442773610334,\n\
\ \"acc_norm\": 0.32340425531914896,\n \"acc_norm_stderr\": 0.030579442773610334\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.03455930201924812,\n\
\ \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.03455930201924812\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.26455026455026454,\n \"acc_stderr\": 0.02271746789770861,\n \"\
acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.02271746789770861\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.03809523809523811,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.03809523809523811\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.16,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.26129032258064516,\n\
\ \"acc_stderr\": 0.024993053397764822,\n \"acc_norm\": 0.26129032258064516,\n\
\ \"acc_norm_stderr\": 0.024993053397764822\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.26108374384236455,\n \"acc_stderr\": 0.030903796952114485,\n\
\ \"acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.030903796952114485\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\"\
: 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.0340150671524904,\n\
\ \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.0340150671524904\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.20707070707070707,\n \"acc_stderr\": 0.028869778460267052,\n \"\
acc_norm\": 0.20707070707070707,\n \"acc_norm_stderr\": 0.028869778460267052\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.2694300518134715,\n \"acc_stderr\": 0.032018671228777947,\n\
\ \"acc_norm\": 0.2694300518134715,\n \"acc_norm_stderr\": 0.032018671228777947\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.34102564102564104,\n \"acc_stderr\": 0.024035489676335068,\n\
\ \"acc_norm\": 0.34102564102564104,\n \"acc_norm_stderr\": 0.024035489676335068\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24369747899159663,\n \"acc_stderr\": 0.027886828078380575,\n\
\ \"acc_norm\": 0.24369747899159663,\n \"acc_norm_stderr\": 0.027886828078380575\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2052980132450331,\n \"acc_stderr\": 0.03297986648473834,\n \"\
acc_norm\": 0.2052980132450331,\n \"acc_norm_stderr\": 0.03297986648473834\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23302752293577983,\n \"acc_stderr\": 0.0181256691808615,\n \"\
acc_norm\": 0.23302752293577983,\n \"acc_norm_stderr\": 0.0181256691808615\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.37037037037037035,\n \"acc_stderr\": 0.03293377139415191,\n \"\
acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.03293377139415191\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.22549019607843138,\n \"acc_stderr\": 0.02933116229425173,\n \"\
acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.02933116229425173\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2869198312236287,\n \"acc_stderr\": 0.02944377302259469,\n \
\ \"acc_norm\": 0.2869198312236287,\n \"acc_norm_stderr\": 0.02944377302259469\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.36771300448430494,\n\
\ \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.36771300448430494,\n\
\ \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.26993865030674846,\n \"acc_stderr\": 0.03487825168497892,\n\
\ \"acc_norm\": 0.26993865030674846,\n \"acc_norm_stderr\": 0.03487825168497892\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n\
\ \"acc_stderr\": 0.04059867246952685,\n \"acc_norm\": 0.24107142857142858,\n\
\ \"acc_norm_stderr\": 0.04059867246952685\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690877,\n\
\ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690877\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23931623931623933,\n\
\ \"acc_stderr\": 0.027951826808924333,\n \"acc_norm\": 0.23931623931623933,\n\
\ \"acc_norm_stderr\": 0.027951826808924333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.280970625798212,\n\
\ \"acc_stderr\": 0.016073127851221246,\n \"acc_norm\": 0.280970625798212,\n\
\ \"acc_norm_stderr\": 0.016073127851221246\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.23202614379084968,\n \"acc_stderr\": 0.024170840879341005,\n\
\ \"acc_norm\": 0.23202614379084968,\n \"acc_norm_stderr\": 0.024170840879341005\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2765273311897106,\n\
\ \"acc_stderr\": 0.02540383297817961,\n \"acc_norm\": 0.2765273311897106,\n\
\ \"acc_norm_stderr\": 0.02540383297817961\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.21631205673758866,\n \"acc_stderr\": 0.024561720560562793,\n \
\ \"acc_norm\": 0.21631205673758866,\n \"acc_norm_stderr\": 0.024561720560562793\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23989569752281617,\n\
\ \"acc_stderr\": 0.010906282617981645,\n \"acc_norm\": 0.23989569752281617,\n\
\ \"acc_norm_stderr\": 0.010906282617981645\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3382352941176471,\n \"acc_stderr\": 0.02873932851398358,\n\
\ \"acc_norm\": 0.3382352941176471,\n \"acc_norm_stderr\": 0.02873932851398358\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2630718954248366,\n \"acc_stderr\": 0.017812676542320657,\n \
\ \"acc_norm\": 0.2630718954248366,\n \"acc_norm_stderr\": 0.017812676542320657\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.32727272727272727,\n\
\ \"acc_stderr\": 0.044942908662520896,\n \"acc_norm\": 0.32727272727272727,\n\
\ \"acc_norm_stderr\": 0.044942908662520896\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.1673469387755102,\n \"acc_stderr\": 0.023897144768914524,\n\
\ \"acc_norm\": 0.1673469387755102,\n \"acc_norm_stderr\": 0.023897144768914524\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.030360490154014645,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.030360490154014645\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.30120481927710846,\n\
\ \"acc_stderr\": 0.03571609230053481,\n \"acc_norm\": 0.30120481927710846,\n\
\ \"acc_norm_stderr\": 0.03571609230053481\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21637426900584794,\n \"acc_stderr\": 0.03158149539338733,\n\
\ \"acc_norm\": 0.21637426900584794,\n \"acc_norm_stderr\": 0.03158149539338733\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.21909424724602203,\n\
\ \"mc1_stderr\": 0.014480038578757438,\n \"mc2\": 0.3798174614120003,\n\
\ \"mc2_stderr\": 0.01429160027055937\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6408839779005525,\n \"acc_stderr\": 0.013483115202120225\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02122820318423048,\n \
\ \"acc_stderr\": 0.003970449129848635\n }\n}\n```"
repo_url: https://huggingface.co/Josephgflowers/TinyLlama-Cinder-1.3B-Test.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|arc:challenge|25_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|gsm8k|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hellaswag|10_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T17-11-20.414206.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-27T17-11-20.414206.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- '**/details_harness|winogrande|5_2024-01-27T17-11-20.414206.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-27T17-11-20.414206.parquet'
- config_name: results
data_files:
- split: 2024_01_27T17_11_20.414206
path:
- results_2024-01-27T17-11-20.414206.parquet
- split: latest
path:
- results_2024-01-27T17-11-20.414206.parquet
---
# Dataset Card for Evaluation run of Josephgflowers/TinyLlama-Cinder-1.3B-Test.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Josephgflowers/TinyLlama-Cinder-1.3B-Test.2](https://huggingface.co/Josephgflowers/TinyLlama-Cinder-1.3B-Test.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Josephgflowers__TinyLlama-Cinder-1.3B-Test.2",
"harness_winogrande_5",
split="train")
```
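Every evaluated task has its own configuration, and each configuration exposes both a timestamp-named split and a `latest` split (see the `configs` section of the YAML metadata above). The following is a minimal sketch, using only config and split names declared in this card, of how you might enumerate the configurations and load the aggregated `results` configuration:
```python
from datasets import get_dataset_config_names, load_dataset

REPO = "open-llm-leaderboard/details_Josephgflowers__TinyLlama-Cinder-1.3B-Test.2"

# List the 63 configurations (one per evaluated task, plus the aggregated "results").
configs = get_dataset_config_names(REPO)
print(len(configs), configs[:5])

# "latest" always points at the most recent run; the timestamp-named split
# (here "2024_01_27T17_11_20.414206") pins a specific run.
aggregated = load_dataset(REPO, "results", split="latest")
print(aggregated)
```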
## Latest results
These are the [latest results from run 2024-01-27T17:11:20.414206](https://huggingface.co/datasets/open-llm-leaderboard/details_Josephgflowers__TinyLlama-Cinder-1.3B-Test.2/blob/main/results_2024-01-27T17-11-20.414206.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2632532516237253,
"acc_stderr": 0.03096899556861394,
"acc_norm": 0.26383605058795256,
"acc_norm_stderr": 0.03172701859516164,
"mc1": 0.21909424724602203,
"mc1_stderr": 0.014480038578757438,
"mc2": 0.3798174614120003,
"mc2_stderr": 0.01429160027055937
},
"harness|arc:challenge|25": {
"acc": 0.31143344709897613,
"acc_stderr": 0.013532472099850945,
"acc_norm": 0.3370307167235495,
"acc_norm_stderr": 0.01381347665290227
},
"harness|hellaswag|10": {
"acc": 0.4422425811591316,
"acc_stderr": 0.004956378590571539,
"acc_norm": 0.5866361282613025,
"acc_norm_stderr": 0.004914305798575694
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.035914440841969694,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.035914440841969694
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27547169811320754,
"acc_stderr": 0.027495663683724057,
"acc_norm": 0.27547169811320754,
"acc_norm_stderr": 0.027495663683724057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.03126511206173042,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.03126511206173042
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.32340425531914896,
"acc_stderr": 0.030579442773610334,
"acc_norm": 0.32340425531914896,
"acc_norm_stderr": 0.030579442773610334
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2206896551724138,
"acc_stderr": 0.03455930201924812,
"acc_norm": 0.2206896551724138,
"acc_norm_stderr": 0.03455930201924812
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.02271746789770861,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.02271746789770861
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.03809523809523811,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.03809523809523811
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.16,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.16,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.26129032258064516,
"acc_stderr": 0.024993053397764822,
"acc_norm": 0.26129032258064516,
"acc_norm_stderr": 0.024993053397764822
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.26108374384236455,
"acc_stderr": 0.030903796952114485,
"acc_norm": 0.26108374384236455,
"acc_norm_stderr": 0.030903796952114485
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.0340150671524904,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.0340150671524904
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.20707070707070707,
"acc_stderr": 0.028869778460267052,
"acc_norm": 0.20707070707070707,
"acc_norm_stderr": 0.028869778460267052
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2694300518134715,
"acc_stderr": 0.032018671228777947,
"acc_norm": 0.2694300518134715,
"acc_norm_stderr": 0.032018671228777947
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.34102564102564104,
"acc_stderr": 0.024035489676335068,
"acc_norm": 0.34102564102564104,
"acc_norm_stderr": 0.024035489676335068
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24369747899159663,
"acc_stderr": 0.027886828078380575,
"acc_norm": 0.24369747899159663,
"acc_norm_stderr": 0.027886828078380575
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2052980132450331,
"acc_stderr": 0.03297986648473834,
"acc_norm": 0.2052980132450331,
"acc_norm_stderr": 0.03297986648473834
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23302752293577983,
"acc_stderr": 0.0181256691808615,
"acc_norm": 0.23302752293577983,
"acc_norm_stderr": 0.0181256691808615
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.03293377139415191,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.03293377139415191
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.02933116229425173,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.02933116229425173
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2869198312236287,
"acc_stderr": 0.02944377302259469,
"acc_norm": 0.2869198312236287,
"acc_norm_stderr": 0.02944377302259469
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.36771300448430494,
"acc_stderr": 0.03236198350928275,
"acc_norm": 0.36771300448430494,
"acc_norm_stderr": 0.03236198350928275
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26993865030674846,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.26993865030674846,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952685,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952685
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690877,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690877
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23931623931623933,
"acc_stderr": 0.027951826808924333,
"acc_norm": 0.23931623931623933,
"acc_norm_stderr": 0.027951826808924333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.280970625798212,
"acc_stderr": 0.016073127851221246,
"acc_norm": 0.280970625798212,
"acc_norm_stderr": 0.016073127851221246
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.23202614379084968,
"acc_stderr": 0.024170840879341005,
"acc_norm": 0.23202614379084968,
"acc_norm_stderr": 0.024170840879341005
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2765273311897106,
"acc_stderr": 0.02540383297817961,
"acc_norm": 0.2765273311897106,
"acc_norm_stderr": 0.02540383297817961
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.21631205673758866,
"acc_stderr": 0.024561720560562793,
"acc_norm": 0.21631205673758866,
"acc_norm_stderr": 0.024561720560562793
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23989569752281617,
"acc_stderr": 0.010906282617981645,
"acc_norm": 0.23989569752281617,
"acc_norm_stderr": 0.010906282617981645
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3382352941176471,
"acc_stderr": 0.02873932851398358,
"acc_norm": 0.3382352941176471,
"acc_norm_stderr": 0.02873932851398358
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2630718954248366,
"acc_stderr": 0.017812676542320657,
"acc_norm": 0.2630718954248366,
"acc_norm_stderr": 0.017812676542320657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.32727272727272727,
"acc_stderr": 0.044942908662520896,
"acc_norm": 0.32727272727272727,
"acc_norm_stderr": 0.044942908662520896
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.1673469387755102,
"acc_stderr": 0.023897144768914524,
"acc_norm": 0.1673469387755102,
"acc_norm_stderr": 0.023897144768914524
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.030360490154014645,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.030360490154014645
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.30120481927710846,
"acc_stderr": 0.03571609230053481,
"acc_norm": 0.30120481927710846,
"acc_norm_stderr": 0.03571609230053481
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21637426900584794,
"acc_stderr": 0.03158149539338733,
"acc_norm": 0.21637426900584794,
"acc_norm_stderr": 0.03158149539338733
},
"harness|truthfulqa:mc|0": {
"mc1": 0.21909424724602203,
"mc1_stderr": 0.014480038578757438,
"mc2": 0.3798174614120003,
"mc2_stderr": 0.01429160027055937
},
"harness|winogrande|5": {
"acc": 0.6408839779005525,
"acc_stderr": 0.013483115202120225
},
"harness|gsm8k|5": {
"acc": 0.02122820318423048,
"acc_stderr": 0.003970449129848635
}
}
```
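Each task entry in the block above follows the same shape (an `acc`/`acc_stderr` pair, plus `acc_norm` or `mc1`/`mc2` fields where applicable), so it can be post-processed directly once saved. A minimal sketch, assuming the JSON above has been written to a local file (the path below is hypothetical):
```python
import json

# Hypothetical local copy of the aggregated results block shown above.
with open("results.json") as f:
    results = json.load(f)

# Average plain accuracy over the "hendrycksTest" (MMLU) subtasks.
mmlu_acc = [m["acc"] for task, m in results.items()
            if task.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu_acc)} subtasks, mean acc = {sum(mmlu_acc) / len(mmlu_acc):.4f}")
```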
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
autoevaluate/autoeval-staging-eval-project-ml6team__cnn_dailymail_nl-a771a5f9-12165620 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- ml6team/cnn_dailymail_nl
eval_info:
task: summarization
model: yhavinga/t5-v1.1-base-dutch-cnn-test
metrics: []
dataset_name: ml6team/cnn_dailymail_nl
dataset_config: default
dataset_split: test
col_mapping:
text: article
target: highlights
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: yhavinga/t5-v1.1-base-dutch-cnn-test
* Dataset: ml6team/cnn_dailymail_nl
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
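As a minimal sketch of what the task and dataset configuration above point at (the column renaming is only illustrative and assumes the source dataset is publicly loadable), the evaluated split and its column mapping look like this:
```python
from datasets import load_dataset

# Load the evaluated split of the source dataset (config "default", split "test").
source = load_dataset("ml6team/cnn_dailymail_nl", "default", split="test")

# Apply the col_mapping from the card: "article" is the input text,
# "highlights" is the reference summary.
source = source.rename_columns({"article": "text", "highlights": "target"})
print(source[0]["text"][:200])
```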
## Contributions
Thanks to [@yhavinga](https://huggingface.co/yhavinga) for evaluating this model. |
jizhongpeng/AIGCQA-30K | ---
license: unknown
---
AIGCQA-30K dataset ready for Q-Align training |
adalib/monkey-cond-gen-sub-40 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
dtype: string
- name: api
dtype: string
splits:
- name: train
num_bytes: 17605163
num_examples: 1217
download_size: 5980731
dataset_size: 17605163
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AdapterOcean/Open_Platypus_standardized_cluster_7 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 7461771
num_examples: 673
download_size: 2259471
dataset_size: 7461771
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Open_Platypus_standardized_cluster_7"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wbxlala/har_str_100 | ---
license: cc-by-4.0
---
|
danielpleus/wiki-nds | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 92432660
num_examples: 84158
download_size: 47740161
dataset_size: 92432660
---
# Dataset Card for "wiki-nds"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tr416/dataset_20231006_192150 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 762696.0
num_examples: 297
- name: test
num_bytes: 7704.0
num_examples: 3
download_size: 73785
dataset_size: 770400.0
---
# Dataset Card for "dataset_20231006_192150"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_qqp_he_inanimate_objects | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 303835
num_examples: 1513
- name: test
num_bytes: 3021964
num_examples: 14873
- name: train
num_bytes: 2801005
num_examples: 13532
download_size: 3860132
dataset_size: 6126804
---
# Dataset Card for "MULTI_VALUE_qqp_he_inanimate_objects"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Mohammed-Altaf__Medical-ChatBot | ---
pretty_name: Evaluation run of Mohammed-Altaf/Medical-ChatBot
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Mohammed-Altaf/Medical-ChatBot](https://huggingface.co/Mohammed-Altaf/Medical-ChatBot)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Mohammed-Altaf__Medical-ChatBot_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-23T17:51:39.546236](https://huggingface.co/datasets/open-llm-leaderboard/details_Mohammed-Altaf__Medical-ChatBot_public/blob/main/results_2023-11-23T17-51-39.546236.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26179013153474723,\n\
\ \"acc_stderr\": 0.030983466516240496,\n \"acc_norm\": 0.262541940150786,\n\
\ \"acc_norm_stderr\": 0.031759054123644256,\n \"mc1\": 0.26560587515299877,\n\
\ \"mc1_stderr\": 0.015461027627253597,\n \"mc2\": 0.41044189971272244,\n\
\ \"mc2_stderr\": 0.015229110119195517,\n \"em\": 0.001572986577181208,\n\
\ \"em_stderr\": 0.000405845113241773,\n \"f1\": 0.06370071308724842,\n\
\ \"f1_stderr\": 0.0014122765324405353\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2790102389078498,\n \"acc_stderr\": 0.013106784883601336,\n\
\ \"acc_norm\": 0.3046075085324232,\n \"acc_norm_stderr\": 0.013449522109932487\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3324039036048596,\n\
\ \"acc_stderr\": 0.004701121421805424,\n \"acc_norm\": 0.3859788886675961,\n\
\ \"acc_norm_stderr\": 0.004858306877874615\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3037037037037037,\n\
\ \"acc_stderr\": 0.03972552884785137,\n \"acc_norm\": 0.3037037037037037,\n\
\ \"acc_norm_stderr\": 0.03972552884785137\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.23026315789473684,\n \"acc_stderr\": 0.03426059424403165,\n\
\ \"acc_norm\": 0.23026315789473684,\n \"acc_norm_stderr\": 0.03426059424403165\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.28,\n\
\ \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3018867924528302,\n \"acc_stderr\": 0.028254200344438665,\n\
\ \"acc_norm\": 0.3018867924528302,\n \"acc_norm_stderr\": 0.028254200344438665\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036624,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036624\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2774566473988439,\n\
\ \"acc_stderr\": 0.03414014007044036,\n \"acc_norm\": 0.2774566473988439,\n\
\ \"acc_norm_stderr\": 0.03414014007044036\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062947,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062947\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2553191489361702,\n \"acc_stderr\": 0.028504856470514203,\n\
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.028504856470514203\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489361,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489361\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.036951833116502325,\n\
\ \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.036951833116502325\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n\
\ \"acc_stderr\": 0.03512207412302054,\n \"acc_norm\": 0.19047619047619047,\n\
\ \"acc_norm_stderr\": 0.03512207412302054\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368466,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368466\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.29354838709677417,\n\
\ \"acc_stderr\": 0.025906087021319288,\n \"acc_norm\": 0.29354838709677417,\n\
\ \"acc_norm_stderr\": 0.025906087021319288\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n\
\ \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23636363636363636,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.23636363636363636,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.35858585858585856,\n \"acc_stderr\": 0.03416903640391521,\n \"\
acc_norm\": 0.35858585858585856,\n \"acc_norm_stderr\": 0.03416903640391521\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.32642487046632124,\n \"acc_stderr\": 0.033840286211432945,\n\
\ \"acc_norm\": 0.32642487046632124,\n \"acc_norm_stderr\": 0.033840286211432945\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.30512820512820515,\n \"acc_stderr\": 0.023346335293325884,\n\
\ \"acc_norm\": 0.30512820512820515,\n \"acc_norm_stderr\": 0.023346335293325884\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073828,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073828\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2605042016806723,\n \"acc_stderr\": 0.02851025151234193,\n \
\ \"acc_norm\": 0.2605042016806723,\n \"acc_norm_stderr\": 0.02851025151234193\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389024,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3376146788990826,\n \"acc_stderr\": 0.020275265986638903,\n \"\
acc_norm\": 0.3376146788990826,\n \"acc_norm_stderr\": 0.020275265986638903\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321617,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321617\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.23039215686274508,\n \"acc_stderr\": 0.02955429260569506,\n \"\
acc_norm\": 0.23039215686274508,\n \"acc_norm_stderr\": 0.02955429260569506\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.24472573839662448,\n \"acc_stderr\": 0.027985699387036416,\n \
\ \"acc_norm\": 0.24472573839662448,\n \"acc_norm_stderr\": 0.027985699387036416\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.1210762331838565,\n\
\ \"acc_stderr\": 0.021894174113185737,\n \"acc_norm\": 0.1210762331838565,\n\
\ \"acc_norm_stderr\": 0.021894174113185737\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.33884297520661155,\n \"acc_stderr\": 0.043207678075366705,\n \"\
acc_norm\": 0.33884297520661155,\n \"acc_norm_stderr\": 0.043207678075366705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n\
\ \"acc_stderr\": 0.04077494709252628,\n \"acc_norm\": 0.23148148148148148,\n\
\ \"acc_norm_stderr\": 0.04077494709252628\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.27607361963190186,\n \"acc_stderr\": 0.0351238528370505,\n\
\ \"acc_norm\": 0.27607361963190186,\n \"acc_norm_stderr\": 0.0351238528370505\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.17857142857142858,\n\
\ \"acc_stderr\": 0.036352091215778065,\n \"acc_norm\": 0.17857142857142858,\n\
\ \"acc_norm_stderr\": 0.036352091215778065\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.36893203883495146,\n \"acc_stderr\": 0.047776151811567386,\n\
\ \"acc_norm\": 0.36893203883495146,\n \"acc_norm_stderr\": 0.047776151811567386\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2264957264957265,\n\
\ \"acc_stderr\": 0.027421007295392912,\n \"acc_norm\": 0.2264957264957265,\n\
\ \"acc_norm_stderr\": 0.027421007295392912\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.22860791826309068,\n\
\ \"acc_stderr\": 0.015016884698539894,\n \"acc_norm\": 0.22860791826309068,\n\
\ \"acc_norm_stderr\": 0.015016884698539894\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.27167630057803466,\n \"acc_stderr\": 0.02394851290546835,\n\
\ \"acc_norm\": 0.27167630057803466,\n \"acc_norm_stderr\": 0.02394851290546835\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.02380518652488814,\n\
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.02380518652488814\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.022122439772480764,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.022122439772480764\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.22530864197530864,\n \"acc_stderr\": 0.023246202647819746,\n\
\ \"acc_norm\": 0.22530864197530864,\n \"acc_norm_stderr\": 0.023246202647819746\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24468085106382978,\n \"acc_stderr\": 0.025645553622266726,\n \
\ \"acc_norm\": 0.24468085106382978,\n \"acc_norm_stderr\": 0.025645553622266726\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.01099615663514269,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.01099615663514269\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.17279411764705882,\n \"acc_stderr\": 0.02296606758558179,\n\
\ \"acc_norm\": 0.17279411764705882,\n \"acc_norm_stderr\": 0.02296606758558179\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.238562091503268,\n \"acc_stderr\": 0.017242385828779593,\n \
\ \"acc_norm\": 0.238562091503268,\n \"acc_norm_stderr\": 0.017242385828779593\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.03831305140884603,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.03831305140884603\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3306122448979592,\n \"acc_stderr\": 0.030116426296540603,\n\
\ \"acc_norm\": 0.3306122448979592,\n \"acc_norm_stderr\": 0.030116426296540603\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.1890547263681592,\n\
\ \"acc_stderr\": 0.027686913588013028,\n \"acc_norm\": 0.1890547263681592,\n\
\ \"acc_norm_stderr\": 0.027686913588013028\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.20481927710843373,\n\
\ \"acc_stderr\": 0.03141784291663926,\n \"acc_norm\": 0.20481927710843373,\n\
\ \"acc_norm_stderr\": 0.03141784291663926\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.034462962170884265,\n\
\ \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.034462962170884265\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26560587515299877,\n\
\ \"mc1_stderr\": 0.015461027627253597,\n \"mc2\": 0.41044189971272244,\n\
\ \"mc2_stderr\": 0.015229110119195517\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5485398579321231,\n \"acc_stderr\": 0.01398611030101776\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.001572986577181208,\n \
\ \"em_stderr\": 0.000405845113241773,\n \"f1\": 0.06370071308724842,\n\
\ \"f1_stderr\": 0.0014122765324405353\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.009855951478392721,\n \"acc_stderr\": 0.0027210765770416655\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Mohammed-Altaf/Medical-ChatBot
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|arc:challenge|25_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|drop|3_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|gsm8k|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hellaswag|10_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T17-51-39.546236.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-23T17-51-39.546236.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- '**/details_harness|winogrande|5_2023-11-23T17-51-39.546236.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-23T17-51-39.546236.parquet'
- config_name: results
data_files:
- split: 2023_11_23T17_51_39.546236
path:
- results_2023-11-23T17-51-39.546236.parquet
- split: latest
path:
- results_2023-11-23T17-51-39.546236.parquet
---
# Dataset Card for Evaluation run of Mohammed-Altaf/Medical-ChatBot
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Mohammed-Altaf/Medical-ChatBot
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Mohammed-Altaf/Medical-ChatBot](https://huggingface.co/Mohammed-Altaf/Medical-ChatBot) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Mohammed-Altaf__Medical-ChatBot_public",
"harness_winogrande_5",
split="train")
```
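The same pattern works for any of the 64 configurations listed in the metadata above; for example, a minimal sketch for pulling the aggregated results of the run, or the most recent details of a single task (the variable names are illustrative):
```python
from datasets import load_dataset

# Aggregated metrics for the whole run (the "results" configuration).
results = load_dataset("open-llm-leaderboard/details_Mohammed-Altaf__Medical-ChatBot_public",
                       "results",
                       split="latest")

# Per-sample details for one task, latest run only.
arc_details = load_dataset("open-llm-leaderboard/details_Mohammed-Altaf__Medical-ChatBot_public",
                           "harness_arc_challenge_25",
                           split="latest")
```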
## Latest results
These are the [latest results from run 2023-11-23T17:51:39.546236](https://huggingface.co/datasets/open-llm-leaderboard/details_Mohammed-Altaf__Medical-ChatBot_public/blob/main/results_2023-11-23T17-51-39.546236.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26179013153474723,
"acc_stderr": 0.030983466516240496,
"acc_norm": 0.262541940150786,
"acc_norm_stderr": 0.031759054123644256,
"mc1": 0.26560587515299877,
"mc1_stderr": 0.015461027627253597,
"mc2": 0.41044189971272244,
"mc2_stderr": 0.015229110119195517,
"em": 0.001572986577181208,
"em_stderr": 0.000405845113241773,
"f1": 0.06370071308724842,
"f1_stderr": 0.0014122765324405353
},
"harness|arc:challenge|25": {
"acc": 0.2790102389078498,
"acc_stderr": 0.013106784883601336,
"acc_norm": 0.3046075085324232,
"acc_norm_stderr": 0.013449522109932487
},
"harness|hellaswag|10": {
"acc": 0.3324039036048596,
"acc_stderr": 0.004701121421805424,
"acc_norm": 0.3859788886675961,
"acc_norm_stderr": 0.004858306877874615
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.03972552884785137,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.03972552884785137
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.23026315789473684,
"acc_stderr": 0.03426059424403165,
"acc_norm": 0.23026315789473684,
"acc_norm_stderr": 0.03426059424403165
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3018867924528302,
"acc_stderr": 0.028254200344438665,
"acc_norm": 0.3018867924528302,
"acc_norm_stderr": 0.028254200344438665
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2774566473988439,
"acc_stderr": 0.03414014007044036,
"acc_norm": 0.2774566473988439,
"acc_norm_stderr": 0.03414014007044036
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062947,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062947
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.028504856470514203,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.028504856470514203
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489361,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489361
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.036951833116502325,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.036951833116502325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.03512207412302054,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.03512207412302054
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368466,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368466
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.29354838709677417,
"acc_stderr": 0.025906087021319288,
"acc_norm": 0.29354838709677417,
"acc_norm_stderr": 0.025906087021319288
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35858585858585856,
"acc_stderr": 0.03416903640391521,
"acc_norm": 0.35858585858585856,
"acc_norm_stderr": 0.03416903640391521
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.32642487046632124,
"acc_stderr": 0.033840286211432945,
"acc_norm": 0.32642487046632124,
"acc_norm_stderr": 0.033840286211432945
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.30512820512820515,
"acc_stderr": 0.023346335293325884,
"acc_norm": 0.30512820512820515,
"acc_norm_stderr": 0.023346335293325884
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073828,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073828
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2605042016806723,
"acc_stderr": 0.02851025151234193,
"acc_norm": 0.2605042016806723,
"acc_norm_stderr": 0.02851025151234193
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389024,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3376146788990826,
"acc_stderr": 0.020275265986638903,
"acc_norm": 0.3376146788990826,
"acc_norm_stderr": 0.020275265986638903
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23039215686274508,
"acc_stderr": 0.02955429260569506,
"acc_norm": 0.23039215686274508,
"acc_norm_stderr": 0.02955429260569506
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.24472573839662448,
"acc_stderr": 0.027985699387036416,
"acc_norm": 0.24472573839662448,
"acc_norm_stderr": 0.027985699387036416
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.1210762331838565,
"acc_stderr": 0.021894174113185737,
"acc_norm": 0.1210762331838565,
"acc_norm_stderr": 0.021894174113185737
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.33884297520661155,
"acc_stderr": 0.043207678075366705,
"acc_norm": 0.33884297520661155,
"acc_norm_stderr": 0.043207678075366705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252628,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.27607361963190186,
"acc_stderr": 0.0351238528370505,
"acc_norm": 0.27607361963190186,
"acc_norm_stderr": 0.0351238528370505
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.17857142857142858,
"acc_stderr": 0.036352091215778065,
"acc_norm": 0.17857142857142858,
"acc_norm_stderr": 0.036352091215778065
},
"harness|hendrycksTest-management|5": {
"acc": 0.36893203883495146,
"acc_stderr": 0.047776151811567386,
"acc_norm": 0.36893203883495146,
"acc_norm_stderr": 0.047776151811567386
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2264957264957265,
"acc_stderr": 0.027421007295392912,
"acc_norm": 0.2264957264957265,
"acc_norm_stderr": 0.027421007295392912
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.22860791826309068,
"acc_stderr": 0.015016884698539894,
"acc_norm": 0.22860791826309068,
"acc_norm_stderr": 0.015016884698539894
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.27167630057803466,
"acc_stderr": 0.02394851290546835,
"acc_norm": 0.27167630057803466,
"acc_norm_stderr": 0.02394851290546835
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.02380518652488814,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.02380518652488814
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.022122439772480764,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.022122439772480764
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22530864197530864,
"acc_stderr": 0.023246202647819746,
"acc_norm": 0.22530864197530864,
"acc_norm_stderr": 0.023246202647819746
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24468085106382978,
"acc_stderr": 0.025645553622266726,
"acc_norm": 0.24468085106382978,
"acc_norm_stderr": 0.025645553622266726
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.01099615663514269,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.01099615663514269
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.17279411764705882,
"acc_stderr": 0.02296606758558179,
"acc_norm": 0.17279411764705882,
"acc_norm_stderr": 0.02296606758558179
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.238562091503268,
"acc_stderr": 0.017242385828779593,
"acc_norm": 0.238562091503268,
"acc_norm_stderr": 0.017242385828779593
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2,
"acc_stderr": 0.03831305140884603,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03831305140884603
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3306122448979592,
"acc_stderr": 0.030116426296540603,
"acc_norm": 0.3306122448979592,
"acc_norm_stderr": 0.030116426296540603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.1890547263681592,
"acc_stderr": 0.027686913588013028,
"acc_norm": 0.1890547263681592,
"acc_norm_stderr": 0.027686913588013028
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-virology|5": {
"acc": 0.20481927710843373,
"acc_stderr": 0.03141784291663926,
"acc_norm": 0.20481927710843373,
"acc_norm_stderr": 0.03141784291663926
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26560587515299877,
"mc1_stderr": 0.015461027627253597,
"mc2": 0.41044189971272244,
"mc2_stderr": 0.015229110119195517
},
"harness|winogrande|5": {
"acc": 0.5485398579321231,
"acc_stderr": 0.01398611030101776
},
"harness|drop|3": {
"em": 0.001572986577181208,
"em_stderr": 0.000405845113241773,
"f1": 0.06370071308724842,
"f1_stderr": 0.0014122765324405353
},
"harness|gsm8k|5": {
"acc": 0.009855951478392721,
"acc_stderr": 0.0027210765770416655
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
bonadossou/afrolm_active_learning_dataset | ---
annotations_creators:
- crowdsourced
language:
- amh
- orm
- lin
- hau
- ibo
- kin
- lug
- luo
- pcm
- swa
- wol
- yor
- bam
- bbj
- ewe
- fon
- mos
- nya
- sna
- tsn
- twi
- xho
- zul
language_creators:
- crowdsourced
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: afrolm-dataset
size_categories:
- 1M<n<10M
source_datasets:
- original
tags:
- afrolm
- active learning
- language modeling
- research papers
- natural language processing
- self-active learning
task_categories:
- fill-mask
task_ids:
- masked-language-modeling
---
# AfroLM: A Self-Active Learning-based Multilingual Pretrained Language Model for 23 African Languages
- [GitHub Repository of the Paper](https://github.com/bonaventuredossou/MLM_AL)
This repository contains the dataset for our paper [`AfroLM: A Self-Active Learning-based Multilingual Pretrained Language Model for 23 African Languages`](https://arxiv.org/pdf/2211.03263.pdf), which will appear at the Third Workshop on Simple and Efficient Natural Language Processing (SustaiNLP) at EMNLP 2022.
## Our self-active learning framework

## Languages Covered
AfroLM has been pretrained from scratch on 23 African Languages: Amharic, Afan Oromo, Bambara, Ghomalá, Éwé, Fon, Hausa, Ìgbò, Kinyarwanda, Lingala, Luganda, Luo, Mooré, Chewa, Naija, Shona, Swahili, Setswana, Twi, Wolof, Xhosa, Yorùbá, and Zulu.
## Evaluation Results
AfroLM was evaluated on the MasakhaNER1.0 (10 African Languages) and MasakhaNER2.0 (21 African Languages) datasets, as well as on text classification and sentiment analysis. AfroLM outperformed AfriBERTa, mBERT, and XLMR-base, and was very competitive with AfroXLMR. AfroLM is also very data efficient because it was pretrained on a dataset 14x+ smaller than its competitors' datasets. Below are the average F1-score performances of various models across various datasets. Please consult our paper for more language-level performance details.
| Model | MasakhaNER | MasakhaNER2.0* | Text Classification (Yoruba/Hausa) | Sentiment Analysis (YOSM) | OOD Sentiment Analysis (Twitter -> YOSM) |
|:---: |:---: |:---: | :---: |:---: | :---: |
`AfroLM-Large` | **80.13** | **83.26** | **82.90/91.00** | **85.40** | **68.70** |
`AfriBERTa` | 79.10 | 81.31 | 83.22/90.86 | 82.70 | 65.90 |
`mBERT` | 71.55 | 80.68 | --- | --- | --- |
`XLMR-base` | 79.16 | 83.09 | --- | --- | --- |
`AfroXLMR-base` | `81.90` | `84.55` | --- | --- | --- |
- (*) The evaluation was made on the 11 additional languages of the dataset.
- Bold numbers represent the performance of the model with the **smallest pretrained data**.
## Pretrained Models and Dataset
**Models**: [AfroLM-Large](https://huggingface.co/bonadossou/afrolm_active_learning) and **Dataset**: [AfroLM Dataset](https://huggingface.co/datasets/bonadossou/afrolm_active_learning_dataset)
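As a minimal loading sketch for the dataset hosted in this repository (assuming the default configuration exposes a `train` split, which is not stated in this card; untested):
```python
from datasets import load_dataset

# Assumption: the default configuration of this repo exposes a "train" split.
afrolm_data = load_dataset("bonadossou/afrolm_active_learning_dataset", split="train")
print(afrolm_data)
```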
## HuggingFace usage of AfroLM-large
```python
from transformers import XLMRobertaModel, XLMRobertaTokenizer
model = XLMRobertaModel.from_pretrained("bonadossou/afrolm_active_learning")
tokenizer = XLMRobertaTokenizer.from_pretrained("bonadossou/afrolm_active_learning")
tokenizer.model_max_length = 256
```
The `AutoTokenizer` class does not successfully load our tokenizer, so we recommend using the `XLMRobertaTokenizer` class directly. Depending on your task, load the corresponding variant of the model. Read the [XLMRoberta Documentation](https://huggingface.co/docs/transformers/model_doc/xlm-roberta).
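For example, for the fill-mask task (the task category listed in the YAML header), a minimal sketch along the following lines should work; the use of `XLMRobertaForMaskedLM` and the example sentence are illustrative assumptions, not taken from the paper:
```python
import torch
from transformers import XLMRobertaForMaskedLM, XLMRobertaTokenizer

# Load the masked-LM variant of AfroLM (same checkpoint as above).
tokenizer = XLMRobertaTokenizer.from_pretrained("bonadossou/afrolm_active_learning")
tokenizer.model_max_length = 256
model = XLMRobertaForMaskedLM.from_pretrained("bonadossou/afrolm_active_learning")
model.eval()

# Purely illustrative input with a single masked token.
text = f"Lagos is a city in {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Most likely token at the masked position.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_pos].argmax(dim=-1)
print(tokenizer.decode(predicted_id))
```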
## Reproducing our result: Training and Evaluation
- To train the network, run `python active_learning.py`. You can also wrap it in a `bash` script.
- For the evaluation:
- NER Classification: `bash ner_experiments.sh`
- Text Classification & Sentiment Analysis: `bash text_classification_all.sh`
## Citation
``@inproceedings{dossou-etal-2022-afrolm,
title = "{A}fro{LM}: A Self-Active Learning-based Multilingual Pretrained Language Model for 23 {A}frican Languages",
author = "Dossou, Bonaventure F. P. and
Tonja, Atnafu Lambebo and
Yousuf, Oreen and
Osei, Salomey and
Oppong, Abigail and
Shode, Iyanuoluwa and
Awoyomi, Oluwabusayo Olufunke and
Emezue, Chris",
booktitle = "Proceedings of The Third Workshop on Simple and Efficient Natural Language Processing (SustaiNLP)",
month = dec,
year = "2022",
address = "Abu Dhabi, United Arab Emirates (Hybrid)",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2022.sustainlp-1.11",
pages = "52--64",}``
## Reach out
Do you have a question? Please create an issue and we will reach out as soon as possible. |
darinchau/audio-infos | ---
dataset_info:
features:
- name: chords
sequence: int64
- name: chord_times
sequence: float64
- name: beats
sequence: float64
- name: downbeats
sequence: float64
- name: sample_rate
dtype: int64
- name: genre
dtype: string
- name: audio_name
dtype: string
- name: url
dtype: string
- name: playlist
dtype: string
- name: time_accessed
dtype: int64
- name: views
dtype: int64
- name: length
dtype: int64
- name: rating
dtype: string
- name: age_restricted
dtype: bool
splits:
- name: train
num_bytes: 41189756
num_examples: 5810
download_size: 10072769
dataset_size: 41189756
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/kanon_konomori_watashinitenshigamaiorita | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Kanon Konomori
This is the dataset of Kanon Konomori, containing 176 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 176 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 421 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 449 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 176 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 176 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 176 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 421 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 421 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 309 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 449 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 449 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
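If you prefer scripted downloads, a minimal sketch with `huggingface_hub` (the `dataset-raw.zip` filename is taken from the table above; swap in any other archive name; untested):
```python
from huggingface_hub import hf_hub_download

# Fetch one of the packaged archives listed in the table above.
archive_path = hf_hub_download(
    repo_id="CyberHarem/kanon_konomori_watashinitenshigamaiorita",
    filename="dataset-raw.zip",
    repo_type="dataset",
)
print(archive_path)
```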
|
liuyanchen1015/MULTI_VALUE_qqp_it_is_non_referential | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 358821
num_examples: 2127
- name: test
num_bytes: 3738950
num_examples: 21997
- name: train
num_bytes: 3186907
num_examples: 18760
download_size: 4535420
dataset_size: 7284678
---
# Dataset Card for "MULTI_VALUE_qqp_it_is_non_referential"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NexaAI/ControlnetLight | ---
dataset_info:
features:
- name: frame
dtype: string
- name: target
dtype: image
- name: shadow
dtype: image
- name: position
dtype: string
- name: heading
dtype: string
- name: direction
dtype: string
- name: elevation
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 2429715169.0
num_examples: 3000
download_size: 2228306916
dataset_size: 2429715169.0
---
# Dataset Card for "test-blip2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pharaouk/dharma-1 | ---
configs:
- config_name: default
data_files:
- split: 'dharma_1_full'
path: dharma_1_full*
- split: 'dharma_1_mini'
path: dharma_1_mini*
- split: 'dharma_1_micro'
path: dharma_1_micro*
- split: 'dharma_1_unshuffled'
path: dharma_eval_unshuffled*
---
# "Dharma-1"
A new, carefully curated benchmark set, designed for a new era where the true end user uses LLMs for zero-shot and one-shot tasks the vast majority of the time.
Stop training your models on mindless targets (eval_loss, train_loss); start training your LLM with lightweight Dharma as an eval target.
A mix of all the top benchmarks.
It is formed to have an equal distribution of some of the most trusted benchmarks used by those developing SOTA LLMs, comprising only 3,000 examples for the largest size, with 450 and 90 for Dharma-mini and Dharma-micro respectively.
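As a quick loading sketch (split names taken from the YAML configuration above; untested), a specific size can be loaded like this:
```python
from datasets import load_dataset

# Split names follow the configuration above:
# dharma_1_full, dharma_1_mini, dharma_1_micro, dharma_1_unshuffled.
dharma_mini = load_dataset("pharaouk/dharma-1", split="dharma_1_mini")
print(dharma_mini)
```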
The current version of Dharma is comprised of a curated sampling of the following benchmarks:
- AGIEval
- Bigbench
- MMLU
- Winogrande
- Arc-C
- Arc-E
- OBQA
- TruthfulQA
- Bool-q
Each of these original benchmark datasets has its own subsections; careful work has also gone into choosing an equal distribution of the important subsections of each of these, to best represent the original benchmark creators' goals.
Dharma-1 is now integrated into Axolotl as well, so you can focus on optimizing the other aspects of your training pipeline, model architecture and/or dataset, as opposed to worrying about which evaluation measurement or optimization target will best represent capabilities for the end user.
Benchmarking for top base models will be listed here when completed and verified.
Special thanks to @LDJnr for their contributions. Check out their Puffin dataset here: https://huggingface.co/LDJnr
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_193 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 983356856.0
num_examples: 193118
download_size: 1001961085
dataset_size: 983356856.0
---
# Dataset Card for "chunk_193"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ryanyeo/kirnect_part_01_test_01 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2986
num_examples: 17
download_size: 3404
dataset_size: 2986
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Noppawat-Rew/mimic3_it_gpt | ---
dataset_info:
features:
- name: SUBJECT_ID
dtype: int64
- name: HADM_ID
dtype: int64
- name: TEXT
dtype: string
- name: LABELS
dtype: string
- name: length
dtype: int64
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 20909423
num_examples: 1019
download_size: 10004186
dataset_size: 20909423
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_ddyuudd__dolly-v2-3b | ---
pretty_name: Evaluation run of ddyuudd/dolly-v2-3b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ddyuudd/dolly-v2-3b](https://huggingface.co/ddyuudd/dolly-v2-3b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ddyuudd__dolly-v2-3b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-23T01:43:05.637822](https://huggingface.co/datasets/open-llm-leaderboard/details_ddyuudd__dolly-v2-3b/blob/main/results_2024-02-23T01-43-05.637822.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2583531088279041,\n\
\ \"acc_stderr\": 0.030793547311393207,\n \"acc_norm\": 0.2601888180617266,\n\
\ \"acc_norm_stderr\": 0.03153852072302,\n \"mc1\": 0.22399020807833536,\n\
\ \"mc1_stderr\": 0.014594964329474203,\n \"mc2\": 0.3380010409129166,\n\
\ \"mc2_stderr\": 0.014377199793086676\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.35494880546075086,\n \"acc_stderr\": 0.013983036904094094,\n\
\ \"acc_norm\": 0.3967576791808874,\n \"acc_norm_stderr\": 0.014296513020180628\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.48894642501493724,\n\
\ \"acc_stderr\": 0.00498856194427739,\n \"acc_norm\": 0.650368452499502,\n\
\ \"acc_norm_stderr\": 0.004758790172436687\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.26666666666666666,\n\
\ \"acc_stderr\": 0.03820169914517905,\n \"acc_norm\": 0.26666666666666666,\n\
\ \"acc_norm_stderr\": 0.03820169914517905\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2679245283018868,\n \"acc_stderr\": 0.027257260322494845,\n\
\ \"acc_norm\": 0.2679245283018868,\n \"acc_norm_stderr\": 0.027257260322494845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\"\
: 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2138728323699422,\n\
\ \"acc_stderr\": 0.03126511206173041,\n \"acc_norm\": 0.2138728323699422,\n\
\ \"acc_norm_stderr\": 0.03126511206173041\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.03873958714149351,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.03873958714149351\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.28936170212765955,\n \"acc_stderr\": 0.02964400657700962,\n\
\ \"acc_norm\": 0.28936170212765955,\n \"acc_norm_stderr\": 0.02964400657700962\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.04227054451232199,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.04227054451232199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.30344827586206896,\n \"acc_stderr\": 0.03831226048850333,\n\
\ \"acc_norm\": 0.30344827586206896,\n \"acc_norm_stderr\": 0.03831226048850333\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.022789673145776557,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.022789673145776557\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n\
\ \"acc_stderr\": 0.035122074123020514,\n \"acc_norm\": 0.19047619047619047,\n\
\ \"acc_norm_stderr\": 0.035122074123020514\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.27419354838709675,\n\
\ \"acc_stderr\": 0.025378139970885196,\n \"acc_norm\": 0.27419354838709675,\n\
\ \"acc_norm_stderr\": 0.025378139970885196\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694433,\n\
\ \"acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694433\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\"\
: 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.032876667586034886,\n\
\ \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.032876667586034886\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.21717171717171718,\n \"acc_stderr\": 0.029376616484945644,\n \"\
acc_norm\": 0.21717171717171718,\n \"acc_norm_stderr\": 0.029376616484945644\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21243523316062177,\n \"acc_stderr\": 0.029519282616817244,\n\
\ \"acc_norm\": 0.21243523316062177,\n \"acc_norm_stderr\": 0.029519282616817244\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2230769230769231,\n \"acc_stderr\": 0.021107730127243998,\n\
\ \"acc_norm\": 0.2230769230769231,\n \"acc_norm_stderr\": 0.021107730127243998\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895991,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895991\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.22268907563025211,\n \"acc_stderr\": 0.027025433498882364,\n\
\ \"acc_norm\": 0.22268907563025211,\n \"acc_norm_stderr\": 0.027025433498882364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.25165562913907286,\n \"acc_stderr\": 0.03543304234389985,\n \"\
acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.03543304234389985\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.25137614678899084,\n \"acc_stderr\": 0.018599206360287415,\n \"\
acc_norm\": 0.25137614678899084,\n \"acc_norm_stderr\": 0.018599206360287415\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2361111111111111,\n \"acc_stderr\": 0.02896370257079104,\n \"\
acc_norm\": 0.2361111111111111,\n \"acc_norm_stderr\": 0.02896370257079104\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25980392156862747,\n \"acc_stderr\": 0.03077855467869326,\n \"\
acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.03077855467869326\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.28270042194092826,\n \"acc_stderr\": 0.029312814153955934,\n \
\ \"acc_norm\": 0.28270042194092826,\n \"acc_norm_stderr\": 0.029312814153955934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.32286995515695066,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.32286995515695066,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22137404580152673,\n \"acc_stderr\": 0.036412970813137276,\n\
\ \"acc_norm\": 0.22137404580152673,\n \"acc_norm_stderr\": 0.036412970813137276\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2644628099173554,\n \"acc_stderr\": 0.04026187527591205,\n \"\
acc_norm\": 0.2644628099173554,\n \"acc_norm_stderr\": 0.04026187527591205\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.26380368098159507,\n \"acc_stderr\": 0.034624199316156234,\n\
\ \"acc_norm\": 0.26380368098159507,\n \"acc_norm_stderr\": 0.034624199316156234\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3106796116504854,\n \"acc_stderr\": 0.04582124160161551,\n\
\ \"acc_norm\": 0.3106796116504854,\n \"acc_norm_stderr\": 0.04582124160161551\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.26495726495726496,\n\
\ \"acc_stderr\": 0.02891120880274946,\n \"acc_norm\": 0.26495726495726496,\n\
\ \"acc_norm_stderr\": 0.02891120880274946\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2656449553001277,\n\
\ \"acc_stderr\": 0.015794302487888722,\n \"acc_norm\": 0.2656449553001277,\n\
\ \"acc_norm_stderr\": 0.015794302487888722\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2745664739884393,\n \"acc_stderr\": 0.024027745155265012,\n\
\ \"acc_norm\": 0.2745664739884393,\n \"acc_norm_stderr\": 0.024027745155265012\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23575418994413408,\n\
\ \"acc_stderr\": 0.014196375686290804,\n \"acc_norm\": 0.23575418994413408,\n\
\ \"acc_norm_stderr\": 0.014196375686290804\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24183006535947713,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2733118971061093,\n\
\ \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.2733118971061093,\n\
\ \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25177304964539005,\n \"acc_stderr\": 0.0258921511567094,\n \
\ \"acc_norm\": 0.25177304964539005,\n \"acc_norm_stderr\": 0.0258921511567094\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.25684485006518903,\n\
\ \"acc_stderr\": 0.011158455853098867,\n \"acc_norm\": 0.25684485006518903,\n\
\ \"acc_norm_stderr\": 0.011158455853098867\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.1875,\n \"acc_stderr\": 0.023709788253811766,\n \
\ \"acc_norm\": 0.1875,\n \"acc_norm_stderr\": 0.023709788253811766\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2581699346405229,\n \"acc_stderr\": 0.017704531653250075,\n \
\ \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.017704531653250075\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.32727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.32727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.17142857142857143,\n \"acc_stderr\": 0.024127463462650146,\n\
\ \"acc_norm\": 0.17142857142857143,\n \"acc_norm_stderr\": 0.024127463462650146\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.03014777593540922,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.03014777593540922\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3072289156626506,\n\
\ \"acc_stderr\": 0.035915667978246635,\n \"acc_norm\": 0.3072289156626506,\n\
\ \"acc_norm_stderr\": 0.035915667978246635\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.035087719298245654,\n\
\ \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.035087719298245654\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22399020807833536,\n\
\ \"mc1_stderr\": 0.014594964329474203,\n \"mc2\": 0.3380010409129166,\n\
\ \"mc2_stderr\": 0.014377199793086676\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5911602209944752,\n \"acc_stderr\": 0.013816954295135691\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02047005307050796,\n \
\ \"acc_stderr\": 0.0039004133859157192\n }\n}\n```"
repo_url: https://huggingface.co/ddyuudd/dolly-v2-3b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|arc:challenge|25_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|gsm8k|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hellaswag|10_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T01-43-05.637822.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-23T01-43-05.637822.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- '**/details_harness|winogrande|5_2024-02-23T01-43-05.637822.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-23T01-43-05.637822.parquet'
- config_name: results
data_files:
- split: 2024_02_23T01_43_05.637822
path:
- results_2024-02-23T01-43-05.637822.parquet
- split: latest
path:
- results_2024-02-23T01-43-05.637822.parquet
---
# Dataset Card for Evaluation run of ddyuudd/dolly-v2-3b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ddyuudd/dolly-v2-3b](https://huggingface.co/ddyuudd/dolly-v2-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ddyuudd__dolly-v2-3b",
"harness_winogrande_5",
split="train")
```
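If you only need the aggregated scores rather than the per-example details, the "results" configuration mentioned above can be loaded the same way. A minimal sketch (the exact column layout of the results parquet is not documented here, so inspect it before relying on a particular schema):
```python
from datasets import load_dataset

# The "results" config stores the aggregated scores of the run;
# its "latest" split always points at the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_ddyuudd__dolly-v2-3b",
    "results",
    split="latest",
)
print(results)  # inspect the available columns before assuming a schema
```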
## Latest results
These are the [latest results from run 2024-02-23T01:43:05.637822](https://huggingface.co/datasets/open-llm-leaderboard/details_ddyuudd__dolly-v2-3b/blob/main/results_2024-02-23T01-43-05.637822.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2583531088279041,
"acc_stderr": 0.030793547311393207,
"acc_norm": 0.2601888180617266,
"acc_norm_stderr": 0.03153852072302,
"mc1": 0.22399020807833536,
"mc1_stderr": 0.014594964329474203,
"mc2": 0.3380010409129166,
"mc2_stderr": 0.014377199793086676
},
"harness|arc:challenge|25": {
"acc": 0.35494880546075086,
"acc_stderr": 0.013983036904094094,
"acc_norm": 0.3967576791808874,
"acc_norm_stderr": 0.014296513020180628
},
"harness|hellaswag|10": {
"acc": 0.48894642501493724,
"acc_stderr": 0.00498856194427739,
"acc_norm": 0.650368452499502,
"acc_norm_stderr": 0.004758790172436687
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03820169914517905,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03820169914517905
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19736842105263158,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.19736842105263158,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2679245283018868,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.2679245283018868,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.03126511206173041,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.03126511206173041
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.03873958714149351,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.03873958714149351
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.28936170212765955,
"acc_stderr": 0.02964400657700962,
"acc_norm": 0.28936170212765955,
"acc_norm_stderr": 0.02964400657700962
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.04227054451232199,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.04227054451232199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.30344827586206896,
"acc_stderr": 0.03831226048850333,
"acc_norm": 0.30344827586206896,
"acc_norm_stderr": 0.03831226048850333
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.022789673145776557,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.022789673145776557
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.035122074123020514,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.035122074123020514
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.27419354838709675,
"acc_stderr": 0.025378139970885196,
"acc_norm": 0.27419354838709675,
"acc_norm_stderr": 0.025378139970885196
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2512315270935961,
"acc_stderr": 0.030516530732694433,
"acc_norm": 0.2512315270935961,
"acc_norm_stderr": 0.030516530732694433
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.032876667586034886,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.032876667586034886
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21717171717171718,
"acc_stderr": 0.029376616484945644,
"acc_norm": 0.21717171717171718,
"acc_norm_stderr": 0.029376616484945644
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21243523316062177,
"acc_stderr": 0.029519282616817244,
"acc_norm": 0.21243523316062177,
"acc_norm_stderr": 0.029519282616817244
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2230769230769231,
"acc_stderr": 0.021107730127243998,
"acc_norm": 0.2230769230769231,
"acc_norm_stderr": 0.021107730127243998
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.02646611753895991,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.02646611753895991
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.22268907563025211,
"acc_stderr": 0.027025433498882364,
"acc_norm": 0.22268907563025211,
"acc_norm_stderr": 0.027025433498882364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.03543304234389985,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.03543304234389985
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.25137614678899084,
"acc_stderr": 0.018599206360287415,
"acc_norm": 0.25137614678899084,
"acc_norm_stderr": 0.018599206360287415
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.02896370257079104,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.02896370257079104
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.03077855467869326,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.03077855467869326
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.28270042194092826,
"acc_stderr": 0.029312814153955934,
"acc_norm": 0.28270042194092826,
"acc_norm_stderr": 0.029312814153955934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.32286995515695066,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.32286995515695066,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22137404580152673,
"acc_stderr": 0.036412970813137276,
"acc_norm": 0.22137404580152673,
"acc_norm_stderr": 0.036412970813137276
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2644628099173554,
"acc_stderr": 0.04026187527591205,
"acc_norm": 0.2644628099173554,
"acc_norm_stderr": 0.04026187527591205
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26380368098159507,
"acc_stderr": 0.034624199316156234,
"acc_norm": 0.26380368098159507,
"acc_norm_stderr": 0.034624199316156234
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.3106796116504854,
"acc_stderr": 0.04582124160161551,
"acc_norm": 0.3106796116504854,
"acc_norm_stderr": 0.04582124160161551
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.26495726495726496,
"acc_stderr": 0.02891120880274946,
"acc_norm": 0.26495726495726496,
"acc_norm_stderr": 0.02891120880274946
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2656449553001277,
"acc_stderr": 0.015794302487888722,
"acc_norm": 0.2656449553001277,
"acc_norm_stderr": 0.015794302487888722
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2745664739884393,
"acc_stderr": 0.024027745155265012,
"acc_norm": 0.2745664739884393,
"acc_norm_stderr": 0.024027745155265012
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23575418994413408,
"acc_stderr": 0.014196375686290804,
"acc_norm": 0.23575418994413408,
"acc_norm_stderr": 0.014196375686290804
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24183006535947713,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.24183006535947713,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2733118971061093,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.2733118971061093,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25177304964539005,
"acc_stderr": 0.0258921511567094,
"acc_norm": 0.25177304964539005,
"acc_norm_stderr": 0.0258921511567094
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.25684485006518903,
"acc_stderr": 0.011158455853098867,
"acc_norm": 0.25684485006518903,
"acc_norm_stderr": 0.011158455853098867
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.1875,
"acc_stderr": 0.023709788253811766,
"acc_norm": 0.1875,
"acc_norm_stderr": 0.023709788253811766
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.017704531653250075,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.017704531653250075
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.32727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.32727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17142857142857143,
"acc_stderr": 0.024127463462650146,
"acc_norm": 0.17142857142857143,
"acc_norm_stderr": 0.024127463462650146
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.03014777593540922,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.03014777593540922
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3072289156626506,
"acc_stderr": 0.035915667978246635,
"acc_norm": 0.3072289156626506,
"acc_norm_stderr": 0.035915667978246635
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.035087719298245654,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.035087719298245654
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22399020807833536,
"mc1_stderr": 0.014594964329474203,
"mc2": 0.3380010409129166,
"mc2_stderr": 0.014377199793086676
},
"harness|winogrande|5": {
"acc": 0.5911602209944752,
"acc_stderr": 0.013816954295135691
},
"harness|gsm8k|5": {
"acc": 0.02047005307050796,
"acc_stderr": 0.0039004133859157192
}
}
```
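Because every run is also stored as its own timestamped split, a specific run can be pinned instead of following "latest". A minimal sketch, assuming the split names declared in the configuration above are accepted verbatim by `load_dataset`:
```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_ddyuudd__dolly-v2-3b"

# "latest" resolves to the most recent run of this task...
latest = load_dataset(repo, "harness_winogrande_5", split="latest")

# ...while the timestamped split name pins the run reported above.
pinned = load_dataset(repo, "harness_winogrande_5", split="2024_02_23T01_43_05.637822")

print(len(latest), len(pinned))
```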
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
isek-ai/danbooru-tags-2024 | ---
dataset_info:
config_name: 202402-at20240326
features:
- name: id
dtype: int64
- name: copyright
dtype: string
- name: character
dtype: string
- name: artist
dtype: string
- name: general
dtype: string
- name: meta
dtype: string
- name: rating
dtype: string
- name: score
dtype: int64
- name: created_at
dtype: string
splits:
- name: train
num_bytes: 3524386508
num_examples: 7124975
download_size: 1303752608
dataset_size: 3524386508
configs:
- config_name: 202402-at20240326
data_files:
- split: train
path: 202402-at20240326/train-*
---
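The frontmatter above declares a single `202402-at20240326` configuration with one `train` split (roughly 1.3 GB to download). A minimal loading sketch under that assumption:
```python
from datasets import load_dataset

# Assumes only the configuration and split declared in the frontmatter above.
tags = load_dataset("isek-ai/danbooru-tags-2024", "202402-at20240326", split="train")

print(tags.features)  # id, copyright, character, artist, general, meta, rating, score, created_at
print(tags[0])        # first tagged post
```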
|
freshpearYoon/train_free_59 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 9604563432
num_examples: 10000
download_size: 1276438430
dataset_size: 9604563432
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_kevinpro__Vicuna-13B-CoT | ---
pretty_name: Evaluation run of kevinpro/Vicuna-13B-CoT
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [kevinpro/Vicuna-13B-CoT](https://huggingface.co/kevinpro/Vicuna-13B-CoT) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 runs. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kevinpro__Vicuna-13B-CoT\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T13:31:22.626797](https://huggingface.co/datasets/open-llm-leaderboard/details_kevinpro__Vicuna-13B-CoT/blob/main/results_2023-09-17T13-31-22.626797.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.029677013422818792,\n\
\ \"em_stderr\": 0.0017378324714143493,\n \"f1\": 0.09310612416107406,\n\
\ \"f1_stderr\": 0.002167792401176146,\n \"acc\": 0.4141695683211732,\n\
\ \"acc_stderr\": 0.010019161585538096\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.029677013422818792,\n \"em_stderr\": 0.0017378324714143493,\n\
\ \"f1\": 0.09310612416107406,\n \"f1_stderr\": 0.002167792401176146\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08642911296436695,\n \
\ \"acc_stderr\": 0.00774004433710381\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7419100236779794,\n \"acc_stderr\": 0.012298278833972384\n\
\ }\n}\n```"
repo_url: https://huggingface.co/kevinpro/Vicuna-13B-CoT
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T13_31_22.626797
path:
- '**/details_harness|drop|3_2023-09-17T13-31-22.626797.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T13-31-22.626797.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T13_31_22.626797
path:
- '**/details_harness|gsm8k|5_2023-09-17T13-31-22.626797.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T13-31-22.626797.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:33:25.891730.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:33:25.891730.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:33:25.891730.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T13_31_22.626797
path:
- '**/details_harness|winogrande|5_2023-09-17T13-31-22.626797.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T13-31-22.626797.parquet'
- config_name: results
data_files:
- split: 2023_07_19T18_33_25.891730
path:
- results_2023-07-19T18:33:25.891730.parquet
- split: 2023_09_17T13_31_22.626797
path:
- results_2023-09-17T13-31-22.626797.parquet
- split: latest
path:
- results_2023-09-17T13-31-22.626797.parquet
---
# Dataset Card for Evaluation run of kevinpro/Vicuna-13B-CoT
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/kevinpro/Vicuna-13B-CoT
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [kevinpro/Vicuna-13B-CoT](https://huggingface.co/kevinpro/Vicuna-13B-CoT) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kevinpro__Vicuna-13B-CoT",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T13:31:22.626797](https://huggingface.co/datasets/open-llm-leaderboard/details_kevinpro__Vicuna-13B-CoT/blob/main/results_2023-09-17T13-31-22.626797.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.029677013422818792,
"em_stderr": 0.0017378324714143493,
"f1": 0.09310612416107406,
"f1_stderr": 0.002167792401176146,
"acc": 0.4141695683211732,
"acc_stderr": 0.010019161585538096
},
"harness|drop|3": {
"em": 0.029677013422818792,
"em_stderr": 0.0017378324714143493,
"f1": 0.09310612416107406,
"f1_stderr": 0.002167792401176146
},
"harness|gsm8k|5": {
"acc": 0.08642911296436695,
"acc_stderr": 0.00774004433710381
},
"harness|winogrande|5": {
"acc": 0.7419100236779794,
"acc_stderr": 0.012298278833972384
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
316usman/thematic4b_rr | ---
dataset_info:
features:
- name: text
dtype: string
- name: document_url
dtype: string
- name: source_url
dtype: string
- name: num_tokens
dtype: int64
splits:
- name: train
num_bytes: 172219842.20373985
num_examples: 269577
download_size: 62151980
dataset_size: 172219842.20373985
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Askari11/imdb-llama2-1k | ---
dataset_info:
features:
- name: review
dtype: string
- name: label
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 2686796
num_examples: 1000
download_size: 1741566
dataset_size: 2686796
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
LenguajeNaturalAI/ClinDiagnosES | ---
dataset_info:
features:
- name: caso_clinico
dtype: string
- name: Diagnostico
dtype: string
- name: Especialidad
dtype: string
splits:
- name: train
num_bytes: 47156
num_examples: 62
download_size: 33848
dataset_size: 47156
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: cc-by-nc-sa-4.0
task_categories:
- question-answering
- text-generation
- text2text-generation
language:
- es
tags:
- medical
- biology
pretty_name: ClinDiagnosES
size_categories:
- n<1K
---
## Introduction
This corpus was built with the help of healthcare professionals from several fields: cardiology, traumatology, emergency medicine, psychiatry, neurology, dermatology, otorhinolaryngology, and anesthesiology.
## Usage guide
The template for this dataset, intended to allow a proper evaluation of LLM performance on this task, is the following:
```python
prompt_template="""A partir del caso clínico que se expone a continuación, tu tarea es la siguiente.
Como médico experto, tu tarea es la de diagnosticar al paciente en base al caso clínico. Responde únicamente con el diagnóstico para el paciente de forma concisa.
Caso clínico: {caso_clinico}
"""
# how to use it with an LLM (assumes a `tokenizer` and a `caso_clinico` string are already defined):
system_prompt = "Eres un experto en medicina que realiza diagnósticos en base a casos clínicos."
messages = [
{"role": "system", "content": system_prompt},
{"role": "user", "content": prompt_template.format(caso_clinico=caso_clinico)}
]
mssg = tokenizer.apply_chat_template(messages, tokenize=False)
```
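To run the template over the whole dataset, you can combine it with the `caso_clinico` column. A minimal sketch, reusing `prompt_template`, `system_prompt` and the assumed `tokenizer` from the snippet above (variable names are illustrative only):
```python
from datasets import load_dataset

# Load the single "train" split (62 clinical cases).
dataset = load_dataset("LenguajeNaturalAI/ClinDiagnosES", split="train")

for example in dataset:
    messages = [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": prompt_template.format(caso_clinico=example["caso_clinico"])},
    ]
    # example["Diagnostico"] holds the reference diagnosis to compare model outputs against.
    prompt = tokenizer.apply_chat_template(messages, tokenize=False)
```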
## License
This dataset is distributed under the [CC BY-NC-SA 4.0](https://creativecommons.org/licenses/by-nc-sa/4.0/) license.
## Corpus attribution
The corpus is the result of a joint collaboration between [LenguajeNatural.AI](https://lenguajenatural.ai), [IE University](https://www.ie.edu/university/) and various healthcare professionals.

 |
yeshwanthkesani/llama-train-dataset | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 10352154
num_examples: 6975
download_size: 4860498
dataset_size: 10352154
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_cola_drop_aux_be_gonna | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 250
num_examples: 3
- name: train
num_bytes: 1529
num_examples: 18
download_size: 4798
dataset_size: 1779
---
# Dataset Card for "MULTI_VALUE_cola_drop_aux_be_gonna"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TopicNet/ICD-10 | ---
language:
- ru
multilinguality:
- monolingual
license: other
license_name: topicnet
license_link: >-
https://github.com/machine-intelligence-laboratory/TopicNet/blob/master/LICENSE.txt
task_categories:
- text-classification
task_ids:
- topic-classification
- multi-class-classification
- multi-label-classification
tags:
- topic-modeling
- topic-modelling
- text-clustering
- multimodal-data
- multimodal-learning
- modalities
- document-representation
---
# ICD-10 (МКБ-10)
Some measurable characteristics of the dataset:
* D — number of documents
* <modality name> W — modality dictionary size (number of unique tokens)
* <modality name> len D — average document length in modality tokens (number of tokens)
* <modality name> len D uniq — average document length in unique modality tokens (number of unique tokens)
| | D | @text W | @text len D | @text len D uniq | @letter W | @letter len D | @letter len D uniq |
|:------|------------:|-----------------:|---------------------:|--------------------------:|-------------------:|-----------------------:|----------------------------:|
| value | 1733 | 953168 | 550.01 | 550.01 | 1733 | 1 | 1 |
Information about document lengths in modality tokens:
| | len_total@text | len_total@letter | len_uniq@text | len_uniq@letter |
|:-----|-----------------:|-------------------:|----------------:|------------------:|
| mean | 550.01 | 1 | 550.01 | 1 |
| std | 736.858 | 0 | 736.858 | 0 |
| min | 7 | 1 | 7 | 1 |
| 25% | 148 | 1 | 148 | 1 |
| 50% | 300 | 1 | 300 | 1 |
| 75% | 640 | 1 | 640 | 1 |
| max | 8042 | 1 | 8042 | 1 |
**Metadata**: known class labels (25 classes). See [ICD-10 chapters](https://en.wikipedia.org/wiki/ICD-10#Chapters).
|
celinelee/thestack_py312_fstrings_dpo | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 388757312
num_examples: 100001
download_size: 2737642
dataset_size: 388757312
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
vaibhavnalawade7/attendence | ---
license: mit
---
|
AKM15/arxiv_chunk | ---
dataset_info:
features:
- name: id
dtype: string
- name: paper_id
dtype: string
- name: text
dtype: string
- name: metadata
struct:
- name: authors
sequence: string
- name: categories
sequence: string
- name: primary_category
dtype: string
- name: published
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 3211839268
num_examples: 3970386
download_size: 1025658407
dataset_size: 3211839268
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
chenle015/OpenMP_Question_Answering | ---
license: bsd
---
# OpenMP Question Answering Dataset
The OpenMP Question Answering (OMPQA) Dataset is a new OpenMP question-answering dataset introduced in the paper "LM4HPC: Towards Effective Language Model Application in High-Performance Computing".
It is designed to probe the capabilities of language models in single-turn interactions with users. Similar to other QA datasets, we include
some request-response pairs which are not strictly question-answering pairs. The categories and examples of questions in the OMPQA dataset can be found in the table below.
| **Category** | **Count** | **Example Questions** |
| ------------ | --------- | --------------------- |
| Basics | 40 | What is a worksharing construct in OpenMP? |
| Examples | 20 | Give an example OpenMP C code for computing PI using numerical integration. |
| Compilers | 24 | In what language is LLVM written? <br> How is a parallel region represented in Clang? |
| Benchmarks | 23 | What are the NAS Parallel benchmarks? <br> Which benchmark assesses data race detection tools? |
# Data Usage
The dataset is provided in a CSV file, with each entry in the CSV table representing a pair of question and answer.
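A minimal loading sketch is shown below; the file name `ompqa.csv` and the column names `question`/`answer` are placeholders for illustration, not necessarily the names used in the actual release:
```python
from datasets import load_dataset

# Load the CSV as a Hugging Face dataset (file and column names are assumptions).
ompqa = load_dataset("csv", data_files="ompqa.csv", split="train")

# Print the first few question-answer pairs.
for row in ompqa.select(range(3)):
    print(row["question"])
    print(row["answer"])
```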
# Contribute
You are welcome to join us and become a contributor to this project! If you want to share some datasets, put them in a CSV file and email them to lechen AT iastate.edu. Thank you!
# Citation
If you use the data collection, code, or experimental findings in this repository, please cite our IWOMP paper:
@article{chen2023lm4hpc,
title={LM4HPC: Towards Effective Language Model Application in High-Performance Computing},
author={Chen, Le and Lin, Pei-Hung and Vanderbruggen, Tristan and Liao, Chunhua and Emani, Murali and de Supinski, Bronis},
journal={arXiv preprint arXiv:2306.14979},
year={2023}
}
|
argilla/ultrafeedback-binarized-preferences | ---
dataset_info:
features:
- name: source
dtype: string
- name: instruction
dtype: string
- name: chosen_response
dtype: string
- name: rejected_response
dtype: string
- name: chosen_avg_rating
dtype: float64
- name: rejected_avg_rating
dtype: float64
- name: chosen_model
dtype: string
splits:
- name: train
num_bytes: 203496687.77711597
num_examples: 63619
download_size: 109861341
dataset_size: 203496687.77711597
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Ultrafeedback binarized dataset using the mean of preference ratings
## Introduction
This dataset contains the result of curation work performed by Argilla (using Argilla 😃).
After visually browsing around some examples using the sort and filter feature of Argilla (sort by highest rating for chosen responses), we noticed a strong mismatch between the `overall_score` in the original UF dataset (and the Zephyr train_prefs dataset) and the quality of the chosen response.
By adding the critique rationale to our Argilla Dataset, we confirmed the critique rationale was highly negative, whereas the rating was very high (the highest in fact: `10`). See screenshot below for one example of this issue. After some quick investigation, we identified hundreds of examples having the same issue and a potential bug on the UltraFeedback repo.

For context, [this is the corresponding example](https://huggingface.co/datasets/HuggingFaceH4/ultrafeedback_binarized/viewer/default/train_prefs?row=52108) within the `train_prefs` dataset with a `score_chosen` of `10`.
The dataset is fully open and browsable at https://huggingface.co/spaces/argilla/ultrafeedback-curator (credentials: owner/12345678). Try browsing by discarded or using the sort feature to find problematic records yourself.
## Dataset processing
1. We have identified a buggy behaviour of how `overall_score` was generated in the UltraFeedback dataset using the Critique Model, which caused very low quality (and rated) responses to get a very high score. The reason [is this line](https://github.com/OpenBMB/UltraFeedback/blob/e662fd291e5bdf9103a70c2496dc8f1fbcaefe7b/src/data_annotation/annotate_critique.py#L81) which will give a **`10` to responses that get a `1` from the Critique model**.
2. To **benefit from the preference data of UltraFeedback** (aspect-based preference data: honesty, instruction-following, etc.) and not the **Critique model** (which **evaluates a single response individually**), we have opted for not using `overall_score` and computing the mean of preference ratings instead.
3. We **select the best response based on this mean** (named `best_rated_response`), and keep the one based on the `overall_score` for comparison purposes.
4. We **select a random response with a lower mean rating** (or equal in the worst-case scenario; for preference tuning we'll filter those cases out), named `random_response_for_best_rated`. This follows the method described in the Zephyr paper of picking a random response instead of the lowest-rated response. In any case, we keep all completions for people looking at additional approaches. One could binarize the data differently, for example generating several pairs per row based on their ranking (as done in the OpenAI work).
5. We **remove ties**.
Please note that the binarized version from H4 uses `*_best_overall` with scores `[1,10]` range and we use `avg_rating` in the `[1,5]` range.
Based on an initial analysis, using mean rating vs overall_score picks a different chosen response in ~30K examples (out of ~63K).
Additionally, using overall_score results in picking responses from less powerful models more often. See the distribution below:

## Reproduce
Steps:
- Compute mean of preference ratings (honesty, instruction-following, etc.)
- Pick the best mean rating as the chosen
- Pick random rejected with lower mean (or another random if equal to chosen rating)
- Filter out examples with chosen rating == rejected rating
Code for the base dataset preparation (you can use it for using another strategy for binarization):
```python
from typing import List, Dict, Optional, Any
from datasets import load_dataset
import random
# Load the dataset
dataset = load_dataset("openbmb/UltraFeedback", split="train")#test it: .select(range(10))
def calculate_average_rating(annotations: Dict[str, Any]) -> Optional[float]:
ratings = [int(details['Rating']) for details in annotations.values() if 'Rating' in details and details['Rating'] != "N/A"]
return sum(ratings) / len(ratings) if ratings else None
def select_rejected_responses(completions: List[Dict[str, Any]], comparison_key: str, best_score: float) -> Optional[Dict[str, Any]]:
eligible_responses = [resp for resp in completions if resp.get(comparison_key, -1) < best_score and resp.get(comparison_key) is not None]
sorted_eligible_responses = sorted(eligible_responses, key=lambda x: x.get(comparison_key, -1), reverse=True)
return sorted_eligible_responses#random.choice(eligible_responses) if eligible_responses else None
def process_dataset(record: Dict[str, Any]) -> Dict[str, Any]:
completions = record.get('completions', [])
if not completions:
return {**record, 'best_rated_response': None, 'random_response_for_rated': None}
for response in completions:
response['average_rating'] = calculate_average_rating(response.get('annotations', {}))
best_rated_response = max(completions, key=lambda x: x.get('average_rating', -1))
rejected_responses_list = select_rejected_responses(completions, 'average_rating', best_rated_response.get('average_rating', -1))
rejected_ratings = []
rejected_responses = []
rejected_models = []
for rejected in rejected_responses_list:
rejected_ratings.append(rejected['average_rating'])
rejected_responses.append(rejected['response'])
rejected_models.append(rejected['model'])
    if len(rejected_responses_list) > 0:  # only keep records with at least one strictly lower-rated response
return {
'source': record['source'],
'instruction': record['instruction'],
'chosen_model': best_rated_response.get("model"),
'chosen_rating': best_rated_response.get("average_rating"),
'chosen_response': best_rated_response['response'],
'rejected_responses': rejected_responses,
'rejected_ratings': rejected_ratings,
'completions': completions
}
else:
print(f"No candidates for best with rating: {best_rated_response}")
results = [process_dataset(record) for record in dataset if len(record['completions'])>0]
```
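On top of `results`, the binarization described in the steps above can be sketched as follows; this helper is illustrative and not part of the original preparation script, and the output column names simply mirror the schema of this dataset:
```python
import random
from datasets import Dataset

def binarize(record):
    # Pick one random rejected response; by construction its mean rating is
    # strictly below the chosen one, so ties are already filtered out.
    idx = random.randrange(len(record["rejected_responses"]))
    return {
        "source": record["source"],
        "instruction": record["instruction"],
        "chosen_response": record["chosen_response"],
        "rejected_response": record["rejected_responses"][idx],
        "chosen_avg_rating": record["chosen_rating"],
        "rejected_avg_rating": record["rejected_ratings"][idx],
        "chosen_model": record["chosen_model"],
    }

binarized = Dataset.from_list(
    [binarize(r) for r in results if r and r["rejected_responses"]]
)
```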
Example record:
```json
{
"source":"flan_v2_niv2",
"instruction":"In this task, you're given a sentence and your task is to generate a simplified sentence that focuses on lexical paraphrasing from the original sentence given in the input.\nQ: many other authors of this period described eastern mediterranean cooking in greek and in latin.\nA: ",
"chosen_model":"mpt-30b-chat",
"chosen_rating":4.75,
"chosen_response":"Other writers of that time wrote about eastern Mediterranean cuisine in both Greek and Latin.",
"rejected_responses":[
"Other authors from this period wrote about Mediterranean cooking in Greek and Latin.",
"It turns out many other authors during that time period described Eastern Mediterranean cooking, both in Greek and Latin language.\n\nIn this task, you started with a sentence that describes a topic using complex vocabulary. Your task was to create a simplified version of that sentence. \n\nYour new sentence uses simpler words and language to express the same idea. It tells us that many other writers from the past wrote about food from Eastern Mediterranean countries using both Greek and Latin languages.",
"The authors of the era wrote extensively about Eastern Mediterranean cuisine in both Greek and Latin.\n\nIn this simplified sentence, I maintained the original phrase \"of this period\" to honor the historical context, while changing \"Greek and in Latin\" to a simpler construction that better conveys the same meaning without sounding repetitive or wordy. I also added a positive tone and made sure the phrase was clear and concise.\nIs there anything else I can help you with?"
],
"rejected_ratings":[
4.5,
3.5,
3.25
]
}
```
|
SargeZT/coco-stuff-captioned-depth | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: guide
dtype: image
- name: caption
dtype: string
splits:
- name: test
num_bytes: 4604976.0
num_examples: 8
- name: train
num_bytes: 4380740801.0
num_examples: 9000
download_size: 4386018461
dataset_size: 4385345777.0
---
# Dataset Card for "coco-stuff-captioned-depth"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
RafaG/pretrained_v2 | ---
license: openrail
---
|
HydraLM/unnatural-instructions_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 87021232
num_examples: 131934
download_size: 35634034
dataset_size: 87021232
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "unnatural-instructions_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ZHENGRAN/code_ujb_complete | ---
dataset_info:
features:
- name: function
dtype: string
- name: class_signature
dtype: string
- name: prompt_complete_with_comment
dtype: string
- name: import_context
dtype: string
- name: location
dtype: string
- name: function_tested_rate
dtype: float64
- name: class_field_context
dtype: string
- name: end
dtype: int64
- name: function_name
dtype: string
- name: prompt_chat_with_comment
dtype: string
- name: start
dtype: int64
- name: prompt_complete
dtype: string
- name: comment
dtype: string
- name: code_context
dtype: string
- name: bug_id
dtype: int64
- name: source_dir
dtype: string
- name: prompt_chat
dtype: string
- name: class_function_signature_context
dtype: string
- name: task_id
dtype: string
- name: testmethods
sequence: string
- name: function_signature
dtype: string
- name: project
dtype: string
- name: source
dtype: string
- name: indent
dtype: string
splits:
- name: train
num_bytes: 16710431
num_examples: 238
download_size: 3388832
dataset_size: 16710431
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
DRXD1000/Mini-C4 | ---
dataset_info:
features:
- name: text
dtype: string
- name: timestamp
dtype: string
- name: url
dtype: string
splits:
- name: train
num_bytes: 820676151.2541025
num_examples: 361703
- name: test
num_bytes: 91188003.74589755
num_examples: 40190
download_size: 563300036
dataset_size: 911864155.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
simlamkr1/train-dataset-sim001 | ---
license: other
---
|
kiran475/guanaco-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966692
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
C-MTEB/T2Reranking_en2zh | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
dataset_info:
features:
- name: query
dtype: string
- name: positive
sequence: string
- name: negative
sequence: string
splits:
- name: dev
num_bytes: 206929387
num_examples: 6129
download_size: 120405829
dataset_size: 206929387
---
# Dataset Card for "T2Reranking_en2zh"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dnjdsxor21/sample | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: special_tokens_mask
sequence: int8
- name: ner_map
sequence: int64
splits:
- name: test
num_bytes: 543182240
num_examples: 75610
- name: validation
num_bytes: 93779936
num_examples: 13054
- name: train3
num_bytes: 194542720
num_examples: 27080
download_size: 82128446
dataset_size: 831504896
---
# Dataset Card for "sample"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Aleereza/NER_dataset | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': O
'1': B-DAT
'2': I-DAT
'3': B-EVE
'4': I-EVE
'5': B-LOC
'6': I-LOC
'7': B-ORG
'8': I-ORG
'9': B-PER
'10': I-PER
splits:
- name: train
num_bytes: 453592585.5859551
num_examples: 22521473
- name: test
num_bytes: 25199589.207022466
num_examples: 1251193
- name: val
num_bytes: 25199589.207022466
num_examples: 1251193
download_size: 185174999
dataset_size: 503991764.00000006
---
# Dataset Card for "NER_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anyspeech/ucla_phonetic_corpus | ---
dataset_info:
features:
- name: filename
dtype: string
- name: phones
dtype: string
- name: audio
struct:
- name: array
sequence: float32
- name: sampling_rate
dtype: int64
splits:
- name: eus
num_bytes: 3108551
num_examples: 47
- name: kub
num_bytes: 1715709
num_examples: 29
- name: abk
num_bytes: 4403000
num_examples: 54
- name: ace
num_bytes: 2704786
num_examples: 39
- name: ady
num_bytes: 10482658
num_examples: 124
- name: aeb
num_bytes: 2833699
num_examples: 43
- name: afn
num_bytes: 4851569
num_examples: 85
- name: afr
num_bytes: 6692077
num_examples: 124
- name: agx
num_bytes: 5937667
num_examples: 75
- name: ajp
num_bytes: 3582911
num_examples: 51
- name: aka
num_bytes: 2255575
num_examples: 40
- name: apc
num_bytes: 11257587
num_examples: 157
- name: ape
num_bytes: 4480181
num_examples: 70
- name: apw
num_bytes: 4576388
num_examples: 62
- name: asm
num_bytes: 6262493
num_examples: 86
- name: azb
num_bytes: 4725581
num_examples: 60
- name: bam
num_bytes: 4344032
num_examples: 69
- name: bem
num_bytes: 1838480
num_examples: 26
- name: ben
num_bytes: 2484081
num_examples: 40
- name: bfd
num_bytes: 1792407
num_examples: 24
- name: bfq
num_bytes: 2312935
num_examples: 34
- name: bhk
num_bytes: 2261168
num_examples: 33
- name: bin
num_bytes: 1596474
num_examples: 24
- name: brv
num_bytes: 2927768
num_examples: 45
- name: bsq
num_bytes: 1237379
num_examples: 24
- name: bwr
num_bytes: 2562919
num_examples: 41
- name: cbv
num_bytes: 4163303
num_examples: 63
- name: ces
num_bytes: 2866267
num_examples: 42
- name: cha
num_bytes: 1527287
num_examples: 24
- name: cji
num_bytes: 3050715
num_examples: 45
- name: col
num_bytes: 4068720
num_examples: 46
- name: cpn
num_bytes: 3932592
num_examples: 63
- name: dag
num_bytes: 1617536
num_examples: 23
- name: dan
num_bytes: 5385298
num_examples: 87
- name: deg
num_bytes: 2555446
num_examples: 39
- name: dyo
num_bytes: 2136186
num_examples: 31
- name: efi
num_bytes: 3350397
num_examples: 49
- name: ell
num_bytes: 3481047
num_examples: 51
- name: ema
num_bytes: 1713575
num_examples: 23
- name: ewe
num_bytes: 2530156
num_examples: 38
- name: ffm
num_bytes: 2261106
num_examples: 31
- name: fin
num_bytes: 6433992
num_examples: 107
- name: fub
num_bytes: 1490759
num_examples: 23
- name: gaa
num_bytes: 1750241
num_examples: 28
- name: gla
num_bytes: 1669576
num_examples: 27
- name: guj
num_bytes: 3936456
num_examples: 60
- name: gwx
num_bytes: 1387208
num_examples: 22
- name: hak
num_bytes: 2480163
num_examples: 40
- name: hau
num_bytes: 3942393
num_examples: 62
- name: haw
num_bytes: 3254444
num_examples: 54
- name: heb
num_bytes: 3544505
num_examples: 53
- name: hil
num_bytes: 3170052
num_examples: 51
- name: hin
num_bytes: 5300326
num_examples: 77
- name: hni
num_bytes: 1427423
num_examples: 22
- name: hrv
num_bytes: 4676073
num_examples: 74
- name: hun
num_bytes: 7922854
num_examples: 124
- name: hye
num_bytes: 6344958
num_examples: 81
- name: ibb
num_bytes: 4057572
num_examples: 63
- name: ibo
num_bytes: 3148749
num_examples: 48
- name: idu
num_bytes: 3304523
num_examples: 48
- name: ilo
num_bytes: 7581817
num_examples: 90
- name: isl
num_bytes: 9679083
num_examples: 162
- name: its
num_bytes: 1629008
num_examples: 22
- name: kan
num_bytes: 5438898
num_examples: 86
- name: kea
num_bytes: 3227702
num_examples: 54
- name: khm
num_bytes: 4098080
num_examples: 70
- name: klu
num_bytes: 4025430
num_examples: 75
- name: knn
num_bytes: 4568917
num_examples: 82
- name: kri
num_bytes: 1162442
num_examples: 22
- name: kye
num_bytes: 1319998
num_examples: 23
- name: lad
num_bytes: 3550365
num_examples: 59
- name: lar
num_bytes: 1452546
num_examples: 25
- name: lav
num_bytes: 4733523
num_examples: 68
- name: led
num_bytes: 1327549
num_examples: 23
- name: lgq
num_bytes: 1513947
num_examples: 24
- name: lit
num_bytes: 10973034
num_examples: 134
- name: lkt
num_bytes: 2718478
num_examples: 42
- name: lug
num_bytes: 5087192
num_examples: 67
- name: mak
num_bytes: 3951387
num_examples: 49
- name: mal
num_bytes: 1484963
num_examples: 20
- name: mlt
num_bytes: 6205176
num_examples: 93
- name: mya
num_bytes: 6734121
num_examples: 116
- name: nan
num_bytes: 4714799
num_examples: 76
- name: njm
num_bytes: 2034534
num_examples: 34
- name: nld
num_bytes: 5826824
num_examples: 91
- name: ozm
num_bytes: 1974820
num_examples: 27
- name: pam
num_bytes: 4014947
num_examples: 57
- name: pes
num_bytes: 10911547
num_examples: 156
- name: prs
num_bytes: 7895016
num_examples: 103
- name: run
num_bytes: 3540544
num_examples: 46
- name: sbc
num_bytes: 1778804
num_examples: 23
- name: tsw
num_bytes: 1913455
num_examples: 27
- name: tzm
num_bytes: 2457176
num_examples: 40
- name: wuu
num_bytes: 3631436
num_examples: 71
- name: yue
num_bytes: 7815231
num_examples: 127
download_size: 427484194
dataset_size: 368082762
---
# Dataset Card for "ucla_phonetic_corpus"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SummerJingyun/guanaco-llama2-3.5k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5501405
num_examples: 3500
download_size: 3257474
dataset_size: 5501405
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/shigure_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of shigure/時雨 (Kantai Collection)
This is the dataset of shigure/時雨 (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `blue_eyes, ahoge, braid, long_hair, single_braid, hair_ornament, hair_over_shoulder, brown_hair, black_hair, hair_between_eyes, ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:------------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 570.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shigure_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 352.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shigure_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1228 | 747.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shigure_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 513.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shigure_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1228 | 1007.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shigure_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/shigure_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, blush, hair_flaps, looking_at_viewer, navel, solo, medium_breasts, cleavage, underwear_only, black_bra, black_panties, collarbone, bow, cowboy_shot, on_back, smile |
| 1 | 31 |  |  |  |  |  | 1girl, hair_flaps, solo, looking_at_viewer, black_bikini, blush, medium_breasts, cleavage, navel, smile, sailor_bikini, adapted_costume, collarbone |
| 2 | 5 |  |  |  |  |  | 1girl, black_gloves, black_serafuku, fingerless_gloves, hair_flaps, simple_background, solo, white_background, looking_at_viewer, smile, upper_body, red_neckerchief, necktie, open_mouth |
| 3 | 8 |  |  |  |  |  | 1girl, black_serafuku, hair_flaps, solo, upper_body, looking_at_viewer, red_neckerchief, simple_background, white_background, white_sailor_collar, short_sleeves, smile |
| 4 | 20 |  |  |  |  |  | 1girl, black_serafuku, black_skirt, hair_flaps, pleated_skirt, red_neckerchief, solo, black_gloves, white_background, fingerless_gloves, simple_background, smile, looking_at_viewer, cowboy_shot, red_necktie, short_sleeves, white_sailor_collar, black_shirt, blush |
| 5 | 15 |  |  |  |  |  | 1girl, serafuku, solo, pleated_skirt, looking_at_viewer, fingerless_gloves, hair_flaps, blush |
| 6 | 7 |  |  |  |  |  | 1girl, hair_flaps, portrait, simple_background, solo, white_background, looking_at_viewer, grey_background, open_mouth |
| 7 | 16 |  |  |  |  |  | 1girl, hair_flaps, long_sleeves, official_alternate_costume, sleeveless_shirt, solo, white_shirt, looking_at_viewer, off-shoulder_shirt, black_shirt, blush, smile, blue_skirt, bare_shoulders, red_ribbon, simple_background, bangs, black_thighhighs, turtleneck, medium_breasts, open_mouth, white_background, bag, denim_skirt |
| 8 | 8 |  |  |  |  |  | 1girl, hair_flaps, solo, bridal_veil, looking_at_viewer, smile, wedding_dress, bare_shoulders, blush, white_dress, breasts, hair_flower, wedding_ring, bouquet, elbow_gloves, hair_ribbon, petals, upper_body, white_gloves |
| 9 | 18 |  |  |  |  |  | 1girl, solo, hair_flaps, alternate_costume, smile, blush, looking_at_viewer, wide_sleeves, holding, obi, floral_print, long_sleeves, open_mouth, upper_body, hair_flower, yukata |
| 10 | 9 |  |  |  |  |  | 1girl, solo, alternate_costume, day, outdoors, white_dress, blush, hair_flaps, looking_at_viewer, smile, straw_hat, sun_hat, cloud, flower, sundress, blue_sky, sleeveless_dress, bare_shoulders, hair_ribbon, hand_on_headwear, jewelry, open_mouth |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | hair_flaps | looking_at_viewer | navel | solo | medium_breasts | cleavage | underwear_only | black_bra | black_panties | collarbone | bow | cowboy_shot | on_back | smile | black_bikini | sailor_bikini | adapted_costume | black_gloves | black_serafuku | fingerless_gloves | simple_background | white_background | upper_body | red_neckerchief | necktie | open_mouth | white_sailor_collar | short_sleeves | black_skirt | pleated_skirt | red_necktie | black_shirt | serafuku | portrait | grey_background | long_sleeves | official_alternate_costume | sleeveless_shirt | white_shirt | off-shoulder_shirt | blue_skirt | bare_shoulders | red_ribbon | bangs | black_thighhighs | turtleneck | bag | denim_skirt | bridal_veil | wedding_dress | white_dress | breasts | hair_flower | wedding_ring | bouquet | elbow_gloves | hair_ribbon | petals | white_gloves | alternate_costume | wide_sleeves | holding | obi | floral_print | yukata | day | outdoors | straw_hat | sun_hat | cloud | flower | sundress | blue_sky | sleeveless_dress | hand_on_headwear | jewelry |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------|:-------------|:--------------------|:--------|:-------|:-----------------|:-----------|:-----------------|:------------|:----------------|:-------------|:------|:--------------|:----------|:--------|:---------------|:----------------|:------------------|:---------------|:-----------------|:--------------------|:--------------------|:-------------------|:-------------|:------------------|:----------|:-------------|:----------------------|:----------------|:--------------|:----------------|:--------------|:--------------|:-----------|:-----------|:------------------|:---------------|:-----------------------------|:-------------------|:--------------|:---------------------|:-------------|:-----------------|:-------------|:--------|:-------------------|:-------------|:------|:--------------|:--------------|:----------------|:--------------|:----------|:--------------|:---------------|:----------|:---------------|:--------------|:---------|:---------------|:--------------------|:---------------|:----------|:------|:---------------|:---------|:------|:-----------|:------------|:----------|:--------|:---------|:-----------|:-----------|:-------------------|:-------------------|:----------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 31 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | X | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | | X | X | | X | | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | | X | X | | X | | | | | | | | | | X | | | | | X | | X | X | X | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 20 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | X | | X | | | | X | X | X | X | X | | X | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 15 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | | | | | | | | | X | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 7 |  |  |  |  |  | X | | X | X | | X | | | | | | | | | | | | | | | | | X | X | | | | X | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 16 |  |  |  |  |  | X | X | X | X | | X | X | | | | | | | | | X | | | | | | | X | X | | | | X | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 8 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 9 | 18 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | | | X | | | | | | | | | X | | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | X | | | | | | | X | X | X | X | X | X | | | | | | | | | | | |
| 10 | 9 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | X | | | | | | | | | X | | | | | | X | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X |
|
surrey-nlp/SAD | ---
annotations_creators:
- Jordan Painter, Diptesh Kanojia
language:
- en
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
pretty_name: 'Utilising Weak Supervision to create S3D: A Sarcasm Annotated Dataset'
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- text-classification
---
# Utilising Weak Supervision to Create S3D: A Sarcasm Annotated Dataset
This is the repository for the S3D dataset published at EMNLP 2022. The dataset can help build sarcasm detection models.
# SAD
The SAD dataset is our gold-standard dataset of tweets labelled for sarcasm. These tweets were scraped by observing the '#sarcasm' hashtag and then manually annotated by three annotators.
There are a total of 1170 pairs of sarcastic and non-sarcastic tweets, both tweets in each pair posted by the same user, resulting in a total of 2340 tweets annotated for sarcasm.
These tweets can be accessed by using the Twitter API so that they can be used for other experiments.
# Data Fields
- Tweet ID: The ID of the labelled tweet
- Label: A label to denote if a given tweet is sarcastic
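Since the fields above contain only tweet IDs and labels, the tweet texts themselves have to be hydrated through the Twitter API. A minimal sketch using `tweepy` is given below; the file name `sad.csv`, its column names, and the bearer token are assumptions for illustration only:
```python
import pandas as pd
import tweepy

# Hypothetical CSV with the two fields described above.
df = pd.read_csv("sad.csv")  # assumed columns: "Tweet ID", "Label"

client = tweepy.Client(bearer_token="YOUR_BEARER_TOKEN")

# The v2 lookup endpoint accepts up to 100 tweet IDs per request.
ids = df["Tweet ID"].astype(str).tolist()[:100]
response = client.get_tweets(ids=ids, tweet_fields=["text"])
texts = {str(tweet.id): tweet.text for tweet in (response.data or [])}
```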
# Data Splits
- Train: 1638
- Valid: 351
- Test: 351 |
open-llm-leaderboard/details_h2oai__h2ogpt-oasst1-512-12b | ---
pretty_name: Evaluation run of h2oai/h2ogpt-oasst1-512-12b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [h2oai/h2ogpt-oasst1-512-12b](https://huggingface.co/h2oai/h2ogpt-oasst1-512-12b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_h2oai__h2ogpt-oasst1-512-12b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-16T18:51:34.628441](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-oasst1-512-12b/blob/main/results_2023-10-16T18-51-34.628441.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0010486577181208054,\n\
\ \"em_stderr\": 0.000331458146521924,\n \"f1\": 0.049975880872483294,\n\
\ \"f1_stderr\": 0.0012253223818797603,\n \"acc\": 0.339436730966812,\n\
\ \"acc_stderr\": 0.00841008969581668\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.000331458146521924,\n\
\ \"f1\": 0.049975880872483294,\n \"f1_stderr\": 0.0012253223818797603\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.016679302501895376,\n \
\ \"acc_stderr\": 0.0035275958887224733\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6621941594317285,\n \"acc_stderr\": 0.013292583502910887\n\
\ }\n}\n```"
repo_url: https://huggingface.co/h2oai/h2ogpt-oasst1-512-12b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_16T18_51_34.628441
path:
- '**/details_harness|drop|3_2023-10-16T18-51-34.628441.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-16T18-51-34.628441.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_16T18_51_34.628441
path:
- '**/details_harness|gsm8k|5_2023-10-16T18-51-34.628441.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-16T18-51-34.628441.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:11:10.994515.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:11:10.994515.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:11:10.994515.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_16T18_51_34.628441
path:
- '**/details_harness|winogrande|5_2023-10-16T18-51-34.628441.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-16T18-51-34.628441.parquet'
- config_name: results
data_files:
- split: 2023_07_19T18_11_10.994515
path:
- results_2023-07-19T18:11:10.994515.parquet
- split: 2023_10_16T18_51_34.628441
path:
- results_2023-10-16T18-51-34.628441.parquet
- split: latest
path:
- results_2023-10-16T18-51-34.628441.parquet
---
# Dataset Card for Evaluation run of h2oai/h2ogpt-oasst1-512-12b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/h2oai/h2ogpt-oasst1-512-12b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [h2oai/h2ogpt-oasst1-512-12b](https://huggingface.co/h2oai/h2ogpt-oasst1-512-12b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_h2oai__h2ogpt-oasst1-512-12b",
"harness_winogrande_5",
split="train")
```
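Each per-task configuration also exposes a "latest" split (see the configs listed above), so a sketch along the following lines should pull the most recent details for a single task:
```python
from datasets import load_dataset

# Sketch: the per-task configs above list a "latest" split in addition to the
# timestamped one, so this should return the most recent details for one task.
data = load_dataset(
    "open-llm-leaderboard/details_h2oai__h2ogpt-oasst1-512-12b",
    "harness_winogrande_5",
    split="latest",
)
print(data)
```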
## Latest results
These are the [latest results from run 2023-10-16T18:51:34.628441](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-oasst1-512-12b/blob/main/results_2023-10-16T18-51-34.628441.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0010486577181208054,
"em_stderr": 0.000331458146521924,
"f1": 0.049975880872483294,
"f1_stderr": 0.0012253223818797603,
"acc": 0.339436730966812,
"acc_stderr": 0.00841008969581668
},
"harness|drop|3": {
"em": 0.0010486577181208054,
"em_stderr": 0.000331458146521924,
"f1": 0.049975880872483294,
"f1_stderr": 0.0012253223818797603
},
"harness|gsm8k|5": {
"acc": 0.016679302501895376,
"acc_stderr": 0.0035275958887224733
},
"harness|winogrande|5": {
"acc": 0.6621941594317285,
"acc_stderr": 0.013292583502910887
}
}
```
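The aggregated numbers above are also stored in the "results" configuration, whose "latest" split points at the most recent results file. A minimal sketch for reading them programmatically (the exact column layout follows however the results file was flattened into parquet):
```python
from datasets import load_dataset

# Sketch: the "results" config aggregates all metrics of a run; its "latest"
# split points to the most recent results parquet (see the configs above).
results = load_dataset(
    "open-llm-leaderboard/details_h2oai__h2ogpt-oasst1-512-12b",
    "results",
    split="latest",
)
print(results.column_names)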
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
zaanind/sinhala_englsih_parrel_corpus | ---
language:
- si
- en
license: gpl
size_categories:
- 10K<n<100K
task_categories:
- translation
pretty_name: Zoom Eng-Si Nmt Dataset
dataset_info:
features:
- name: english
dtype: string
- name: sinhala
dtype: string
splits:
- name: train
num_bytes: 8516909
num_examples: 80684
download_size: 4162589
dataset_size: 8516909
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
Follow me on : https://facebook.com/zaanind | https://github.com/zaanind
Contact : zaanind@gmail.com | https://m.me/zaanind | https://t.me/zaanind
Dataset Name: Eng-Sinhala Translation Dataset
Description: This dataset contains approximately 80,000 lines of English-Sinhala translation pairs. It can be used to train models for machine translation tasks and other natural language processing applications.
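A minimal loading sketch with the `datasets` library (field names taken from the dataset metadata above):
```python
from datasets import load_dataset

# Load the training split (~80k English-Sinhala pairs).
ds = load_dataset("zaanind/sinhala_englsih_parrel_corpus", split="train")

# Each example has an "english" and a "sinhala" string field.
print(ds[0]["english"])
print(ds[0]["sinhala"])
```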
Data License: GPL (GNU General Public License). Please ensure that you comply with the terms and conditions of the GPL when using the dataset.
Note: Because of the dataset's size, some sentences may be incorrect, and it is important to ensure the quality and accuracy of the data for training purposes. Consider performing data cleaning and validation to improve the reliability of your model; an illustrative sketch follows below.
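As an illustration only, a simple pass that drops empty and duplicate pairs might look like this:
```python
from datasets import load_dataset

ds = load_dataset("zaanind/sinhala_englsih_parrel_corpus", split="train")

seen = set()

def keep(example):
    # Drop pairs where either side is empty or that were already seen.
    pair = (example["english"].strip(), example["sinhala"].strip())
    if not pair[0] or not pair[1] or pair in seen:
        return False
    seen.add(pair)
    return True

clean = ds.filter(keep)
print(f"kept {len(clean)} of {len(ds)} pairs")
```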
Mission
Our mission is to improve the quality of open-source English to Sinhala machine translation. This dataset, consisting of approximately 80,000 translation pairs, is a step in that direction.
Special Thanks:
We extend our gratitude to the Zoom.lk subtitles team, who collected and cleaned the data; their contributions have been invaluable in making this dataset possible.
Please feel free to reach out if you have any questions, suggestions, or would like to collaborate on further improving this dataset or machine translation models. Your support is greatly appreciated!
(Contact : zaanind@gmail.com | https://m.me/zaanind | https://t.me/zaanind) |
thinkall/2WikiMultihopQA | ---
license: apache-2.0
---
Updated from https://huggingface.co/datasets/voidful/2WikiMultihopQA/blob/main/dev.json with modifications. |
open-llm-leaderboard/details_Azure99__blossom-v4-qwen1_5-7b | ---
pretty_name: Evaluation run of Azure99/blossom-v4-qwen1_5-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Azure99/blossom-v4-qwen1_5-7b](https://huggingface.co/Azure99/blossom-v4-qwen1_5-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Azure99__blossom-v4-qwen1_5-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-19T16:00:32.366678](https://huggingface.co/datasets/open-llm-leaderboard/details_Azure99__blossom-v4-qwen1_5-7b/blob/main/results_2024-02-19T16-00-32.366678.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6032059623783226,\n\
\ \"acc_stderr\": 0.033209915441181244,\n \"acc_norm\": 0.6059048061191813,\n\
\ \"acc_norm_stderr\": 0.03387614332381002,\n \"mc1\": 0.3635250917992656,\n\
\ \"mc1_stderr\": 0.016838862883965827,\n \"mc2\": 0.5368612367135771,\n\
\ \"mc2_stderr\": 0.01556334943663444\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5068259385665529,\n \"acc_stderr\": 0.014610029151379813,\n\
\ \"acc_norm\": 0.5443686006825939,\n \"acc_norm_stderr\": 0.014553749939306866\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5660227046405099,\n\
\ \"acc_stderr\": 0.004946089230153021,\n \"acc_norm\": 0.7611033658633738,\n\
\ \"acc_norm_stderr\": 0.00425538005001511\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5722543352601156,\n\
\ \"acc_stderr\": 0.037724468575180255,\n \"acc_norm\": 0.5722543352601156,\n\
\ \"acc_norm_stderr\": 0.037724468575180255\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715563,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715563\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6275862068965518,\n \"acc_stderr\": 0.0402873153294756,\n\
\ \"acc_norm\": 0.6275862068965518,\n \"acc_norm_stderr\": 0.0402873153294756\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5026455026455027,\n \"acc_stderr\": 0.025750949678130387,\n \"\
acc_norm\": 0.5026455026455027,\n \"acc_norm_stderr\": 0.025750949678130387\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7354838709677419,\n\
\ \"acc_stderr\": 0.02509189237885928,\n \"acc_norm\": 0.7354838709677419,\n\
\ \"acc_norm_stderr\": 0.02509189237885928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5467980295566502,\n \"acc_stderr\": 0.03502544650845872,\n\
\ \"acc_norm\": 0.5467980295566502,\n \"acc_norm_stderr\": 0.03502544650845872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.772020725388601,\n \"acc_stderr\": 0.030276909945178253,\n\
\ \"acc_norm\": 0.772020725388601,\n \"acc_norm_stderr\": 0.030276909945178253\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5717948717948718,\n \"acc_stderr\": 0.025088301454694834,\n\
\ \"acc_norm\": 0.5717948717948718,\n \"acc_norm_stderr\": 0.025088301454694834\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.031566630992154156,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.031566630992154156\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8018348623853211,\n \"acc_stderr\": 0.017090573804217923,\n \"\
acc_norm\": 0.8018348623853211,\n \"acc_norm_stderr\": 0.017090573804217923\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.034076320938540516,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.034076320938540516\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.02798569938703642,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.02798569938703642\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n\
\ \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.6322869955156951,\n\
\ \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6946564885496184,\n \"acc_stderr\": 0.04039314978724561,\n\
\ \"acc_norm\": 0.6946564885496184,\n \"acc_norm_stderr\": 0.04039314978724561\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.039849796533028704,\n \"\
acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.039849796533028704\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.037311335196738925,\n\
\ \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.037311335196738925\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.043546310772605956,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.043546310772605956\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281376,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281376\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7739463601532567,\n\
\ \"acc_stderr\": 0.014957458504335835,\n \"acc_norm\": 0.7739463601532567,\n\
\ \"acc_norm_stderr\": 0.014957458504335835\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.025070713719153183,\n\
\ \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.025070713719153183\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35083798882681566,\n\
\ \"acc_stderr\": 0.015961036675230966,\n \"acc_norm\": 0.35083798882681566,\n\
\ \"acc_norm_stderr\": 0.015961036675230966\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.02671611838015684,\n\
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.02671611838015684\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6334405144694534,\n\
\ \"acc_stderr\": 0.027368078243971625,\n \"acc_norm\": 0.6334405144694534,\n\
\ \"acc_norm_stderr\": 0.027368078243971625\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.691358024691358,\n \"acc_stderr\": 0.025702640260603746,\n\
\ \"acc_norm\": 0.691358024691358,\n \"acc_norm_stderr\": 0.025702640260603746\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4326241134751773,\n \"acc_stderr\": 0.029555454236778852,\n \
\ \"acc_norm\": 0.4326241134751773,\n \"acc_norm_stderr\": 0.029555454236778852\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42503259452411996,\n\
\ \"acc_stderr\": 0.012625879884892001,\n \"acc_norm\": 0.42503259452411996,\n\
\ \"acc_norm_stderr\": 0.012625879884892001\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.03032024326500413,\n\
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.03032024326500413\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5833333333333334,\n \"acc_stderr\": 0.019944914136873583,\n \
\ \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.019944914136873583\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304335,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304335\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7661691542288557,\n\
\ \"acc_stderr\": 0.029929415408348387,\n \"acc_norm\": 0.7661691542288557,\n\
\ \"acc_norm_stderr\": 0.029929415408348387\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3635250917992656,\n\
\ \"mc1_stderr\": 0.016838862883965827,\n \"mc2\": 0.5368612367135771,\n\
\ \"mc2_stderr\": 0.01556334943663444\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.712707182320442,\n \"acc_stderr\": 0.012717481052478028\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5670962850644428,\n \
\ \"acc_stderr\": 0.013647916362576057\n }\n}\n```"
repo_url: https://huggingface.co/Azure99/blossom-v4-qwen1_5-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|arc:challenge|25_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|gsm8k|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hellaswag|10_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T16-00-32.366678.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-19T16-00-32.366678.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- '**/details_harness|winogrande|5_2024-02-19T16-00-32.366678.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-19T16-00-32.366678.parquet'
- config_name: results
data_files:
- split: 2024_02_19T16_00_32.366678
path:
- results_2024-02-19T16-00-32.366678.parquet
- split: latest
path:
- results_2024-02-19T16-00-32.366678.parquet
---
# Dataset Card for Evaluation run of Azure99/blossom-v4-qwen1_5-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Azure99/blossom-v4-qwen1_5-7b](https://huggingface.co/Azure99/blossom-v4-qwen1_5-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Azure99__blossom-v4-qwen1_5-7b",
"harness_winogrande_5",
split="train")
```
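The same pattern works for any configuration listed in the YAML metadata above. As a minimal sketch (treating the aggregated file's exact column layout as an assumption, since it is not documented here), the "results" configuration can also be loaded at its "latest" split:
```python
from datasets import load_dataset

# Minimal sketch: load the aggregated "results" configuration at its "latest" split.
# The exact columns of this parquet file are an assumption; inspect them first.
results = load_dataset(
    "open-llm-leaderboard/details_Azure99__blossom-v4-qwen1_5-7b",
    "results",
    split="latest",
)
print(results.column_names)  # check which aggregated metrics are available
```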
## Latest results
These are the [latest results from run 2024-02-19T16:00:32.366678](https://huggingface.co/datasets/open-llm-leaderboard/details_Azure99__blossom-v4-qwen1_5-7b/blob/main/results_2024-02-19T16-00-32.366678.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6032059623783226,
"acc_stderr": 0.033209915441181244,
"acc_norm": 0.6059048061191813,
"acc_norm_stderr": 0.03387614332381002,
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965827,
"mc2": 0.5368612367135771,
"mc2_stderr": 0.01556334943663444
},
"harness|arc:challenge|25": {
"acc": 0.5068259385665529,
"acc_stderr": 0.014610029151379813,
"acc_norm": 0.5443686006825939,
"acc_norm_stderr": 0.014553749939306866
},
"harness|hellaswag|10": {
"acc": 0.5660227046405099,
"acc_stderr": 0.004946089230153021,
"acc_norm": 0.7611033658633738,
"acc_norm_stderr": 0.00425538005001511
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.037724468575180255,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.037724468575180255
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715563,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715563
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6275862068965518,
"acc_stderr": 0.0402873153294756,
"acc_norm": 0.6275862068965518,
"acc_norm_stderr": 0.0402873153294756
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5026455026455027,
"acc_stderr": 0.025750949678130387,
"acc_norm": 0.5026455026455027,
"acc_norm_stderr": 0.025750949678130387
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7354838709677419,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.7354838709677419,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5467980295566502,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.5467980295566502,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.772020725388601,
"acc_stderr": 0.030276909945178253,
"acc_norm": 0.772020725388601,
"acc_norm_stderr": 0.030276909945178253
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5717948717948718,
"acc_stderr": 0.025088301454694834,
"acc_norm": 0.5717948717948718,
"acc_norm_stderr": 0.025088301454694834
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.031566630992154156,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.031566630992154156
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8018348623853211,
"acc_stderr": 0.017090573804217923,
"acc_norm": 0.8018348623853211,
"acc_norm_stderr": 0.017090573804217923
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.02798569938703642,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.02798569938703642
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928275,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928275
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6946564885496184,
"acc_stderr": 0.04039314978724561,
"acc_norm": 0.6946564885496184,
"acc_norm_stderr": 0.04039314978724561
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.039849796533028704,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.039849796533028704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.037311335196738925,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.037311335196738925
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.043546310772605956,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.043546310772605956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281376,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281376
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7739463601532567,
"acc_stderr": 0.014957458504335835,
"acc_norm": 0.7739463601532567,
"acc_norm_stderr": 0.014957458504335835
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.025070713719153183,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.025070713719153183
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35083798882681566,
"acc_stderr": 0.015961036675230966,
"acc_norm": 0.35083798882681566,
"acc_norm_stderr": 0.015961036675230966
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.02671611838015684,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.02671611838015684
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6334405144694534,
"acc_stderr": 0.027368078243971625,
"acc_norm": 0.6334405144694534,
"acc_norm_stderr": 0.027368078243971625
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.691358024691358,
"acc_stderr": 0.025702640260603746,
"acc_norm": 0.691358024691358,
"acc_norm_stderr": 0.025702640260603746
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4326241134751773,
"acc_stderr": 0.029555454236778852,
"acc_norm": 0.4326241134751773,
"acc_norm_stderr": 0.029555454236778852
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42503259452411996,
"acc_stderr": 0.012625879884892001,
"acc_norm": 0.42503259452411996,
"acc_norm_stderr": 0.012625879884892001
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.03032024326500413,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.03032024326500413
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.019944914136873583,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.019944914136873583
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304335,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304335
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7661691542288557,
"acc_stderr": 0.029929415408348387,
"acc_norm": 0.7661691542288557,
"acc_norm_stderr": 0.029929415408348387
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117826,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965827,
"mc2": 0.5368612367135771,
"mc2_stderr": 0.01556334943663444
},
"harness|winogrande|5": {
"acc": 0.712707182320442,
"acc_stderr": 0.012717481052478028
},
"harness|gsm8k|5": {
"acc": 0.5670962850644428,
"acc_stderr": 0.013647916362576057
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Abirate/english_quotes | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
- crowdsourced
language:
- en
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- multi-label-classification
---
# **Dataset Card for English quotes**
# **I-Dataset Summary**
english_quotes is a dataset of all the quotes retrieved from [goodreads quotes](https://www.goodreads.com/quotes). This dataset can be used for multi-label text classification and text generation. The content of each quote is in English and concerns the domain of datasets for NLP and beyond.
# **II-Supported Tasks and Leaderboards**
- Multi-label text classification: The dataset can be used to train a model for text classification, which consists of classifying quotes by author as well as by topic (using tags). Success on this task is typically measured by achieving a high accuracy (a minimal preprocessing sketch is shown after this list).
- Text generation: The dataset can be used to train a model to generate quotes by fine-tuning an existing pretrained model on the corpus composed of all quotes (or quotes by author).
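As a minimal sketch of the multi-label setup (the use of scikit-learn's `MultiLabelBinarizer` here is an illustrative assumption, not a pipeline prescribed by this card), the tag lists can be turned into binary label vectors:
```python
from datasets import load_dataset
from sklearn.preprocessing import MultiLabelBinarizer

# Load the quotes and binarize their tag lists into multi-label targets.
quotes = load_dataset("Abirate/english_quotes", split="train")

mlb = MultiLabelBinarizer()
labels = mlb.fit_transform(quotes["tags"])  # shape: (num_quotes, num_distinct_tags)

print(len(mlb.classes_), "distinct tags")
print(quotes[0]["quote"], "->", int(labels[0].sum()), "tags")
```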
# **III-Languages**
The texts in the dataset are in English (en).
# **IV-Dataset Structure**
#### Data Instances
A JSON-formatted example of a typical instance in the dataset:
```python
{'author': 'Ralph Waldo Emerson',
'quote': '“To be yourself in a world that is constantly trying to make you something else is the greatest accomplishment.”',
'tags': ['accomplishment', 'be-yourself', 'conformity', 'individuality']}
```
#### Data Fields
- **author** : The author of the quote.
- **quote** : The text of the quote.
- **tags**: The tags could be characterized as topics around the quote.
#### Data Splits
I kept the dataset as one block (train), so it can be shuffled and split by users later using methods of the Hugging Face `datasets` library, such as the `.train_test_split()` method (a short sketch follows).
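A minimal sketch of that workflow (the split proportion and seed are illustrative):
```python
from datasets import load_dataset

# Create a held-out test set from the single "train" block.
quotes = load_dataset("Abirate/english_quotes", split="train")
splits = quotes.train_test_split(test_size=0.1, seed=42)
train_ds, test_ds = splits["train"], splits["test"]
print(len(train_ds), "train quotes /", len(test_ds), "test quotes")
```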
# **V-Dataset Creation**
#### Curation Rationale
I want to share my datasets (created by web scraping and additional cleaning treatments) with the HuggingFace community so that they can use them in NLP tasks to advance artificial intelligence.
#### Source Data
The source of the data is the [goodreads](https://www.goodreads.com/?ref=nav_home) site, specifically [goodreads quotes](https://www.goodreads.com/quotes).
#### Initial Data Collection and Normalization
The data collection process is web scraping using the BeautifulSoup and Requests libraries.
The data is slightly modified after the web scraping: all quotes with "None" tags are removed, and the tag "attributed-no-source" is removed from all tag lists because it adds no value to the topic of the quote.
#### Who are the source data producers?
The data is machine-generated (using web scraping) and subjected to additional human treatment.
Below, I provide the script I created to scrape the data (as well as my additional treatment):
```python
import requests
from bs4 import BeautifulSoup
import pandas as pd
import json
from collections import OrderedDict
page = requests.get('https://www.goodreads.com/quotes')
if page.status_code == 200:
    pageParsed = BeautifulSoup(page.content, 'html5lib')

# Define a function that retrieves information about each HTML quote in dictionary form.
def extract_data_quote(quote_html):
    quote = quote_html.find('div', {'class': 'quoteText'}).get_text().strip().split('\n')[0]
    author = quote_html.find('span', {'class': 'authorOrTitle'}).get_text().strip()
    if quote_html.find('div', {'class': 'greyText smallText left'}) is not None:
        tags_list = [tag.get_text() for tag in quote_html.find('div', {'class': 'greyText smallText left'}).find_all('a')]
        tags = list(OrderedDict.fromkeys(tags_list))  # deduplicate while preserving order
        if 'attributed-no-source' in tags:
            tags.remove('attributed-no-source')
    else:
        tags = None
    data = {'quote': quote, 'author': author, 'tags': tags}
    return data

# Define a function that retrieves all the quotes on a single page.
def get_quotes_data(page_url):
    page = requests.get(page_url)
    if page.status_code == 200:
        pageParsed = BeautifulSoup(page.content, 'html5lib')
        quotes_html_page = pageParsed.find_all('div', {'class': 'quoteDetails'})
        return [extract_data_quote(quote_html) for quote_html in quotes_html_page]

# Retrieve data from the first page.
data = get_quotes_data('https://www.goodreads.com/quotes')

# Retrieve data from all remaining pages.
for i in range(2, 101):
    print(i)
    url = f'https://www.goodreads.com/quotes?page={i}'
    data_current_page = get_quotes_data(url)
    if data_current_page is None:
        continue
    data = data + data_current_page

# Drop quotes without tags.
data_df = pd.DataFrame.from_dict(data)
for i, row in data_df.iterrows():
    if row['tags'] is None:
        data_df = data_df.drop(i)

# Produce the data in JSON Lines format.
data_df.to_json('C:/Users/Abir/Desktop/quotes.jsonl', orient="records", lines=True, force_ascii=False)
# Then I used the familiar process to push it to the Hugging Face hub.
```
#### Annotations
Annotations are part of the initial data collection (see the script above).
# **VI-Additional Informations**
#### Dataset Curators
Abir ELTAIEF
#### Licensing Information
This work is licensed under a Creative Commons Attribution 4.0 International License (all software and libraries used for web scraping are made available under this Creative Commons Attribution license).
#### Contributions
Thanks to [@Abirate](https://huggingface.co/Abirate)
for adding this dataset. |
Luciano/victor_lrec_2020_small | ---
dataset_info:
features:
- name: themes
dtype: string
- name: process_id
dtype: string
- name: file_name
dtype: string
- name: document_type
dtype: string
- name: pages
dtype: int64
- name: body
dtype: string
splits:
- name: train
num_bytes: 219095295
num_examples: 149217
- name: validation
num_bytes: 139364099
num_examples: 94735
- name: test
num_bytes: 140516696
num_examples: 95526
download_size: 154894289
dataset_size: 498976090
---
# Dataset Card for "victor_lrec_2020_small"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SunRise228/business-doc | ---
language:
- en
--- |
cringgaard/boats_dataset | ---
task_categories:
- image-classification
language:
- en
tags:
- FGVC
pretty_name: BOATS
size_categories:
- 1K<n<10K
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ccpp/test1 | ---
license: afl-3.0
---
|
yudiwbs/eli5_id-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 821834
num_examples: 1000
download_size: 458403
dataset_size: 821834
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "eli5_id-llama2-1k"
1000 examples in LLAMA2 format that can be used for fine-tuning.
Source: https://huggingface.co/datasets/indonesian-nlp/eli5_id/
|
sanaeai/atsad1 | ---
dataset_info:
features:
- name: tweet
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 16169142
num_examples: 124133
download_size: 8253585
dataset_size: 16169142
---
# Dataset Card for "atsad1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jordanfan/congress_117_bills_test_bart_summaries_billsum | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: index
dtype: int64
- name: id
dtype: string
- name: policy_areas
dtype: string
- name: cur_summary
dtype: string
- name: cur_text
dtype: string
- name: title
dtype: string
- name: titles_official
dtype: string
- name: titles_short
dtype: string
- name: sponsor_name
dtype: string
- name: sponsor_party
dtype: string
- name: sponsor_state
dtype: string
- name: cleaned_summary
dtype: string
- name: extracted_text
dtype: string
- name: extracted_text_375
dtype: string
- name: extracted_text_750
dtype: string
- name: extracted_text_1000
dtype: string
- name: bertsum_extracted_250
dtype: string
- name: bertsum_extracted_375
dtype: string
- name: bertsum_extracted_375_1000
dtype: string
- name: bertsum_extracted_250_1000
dtype: string
- name: bertsum_extracted_375_750
dtype: string
- name: bertsum_extracted_250_750
dtype: string
- name: bertsum_extracted_375_500
dtype: string
- name: bertsum_extracted_250_500
dtype: string
- name: bertsum_extracted_375_375
dtype: string
- name: bertsum_extracted_250_375
dtype: string
- name: text_len
dtype: int64
- name: billsum_abstracted_1000
dtype: string
- name: billsum_abstracted_500
dtype: string
- name: __index_level_0__
dtype: int64
- name: summary_billsum_abstracted_1000
dtype: string
splits:
- name: test
num_bytes: 15719352
num_examples: 367
download_size: 7121495
dataset_size: 15719352
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
chikino/deadpoolvoz | ---
license: openrail
---
|
CVasNLPExperiments/TinyImagenet_2k_validation_google_flan_t5_xxl_mode_T_SPECIFIC_A_ns_2000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0__Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_clip_tags_LAION_ViT_H_14_2B_simple_specific_rices
num_bytes: 839196
num_examples: 2000
download_size: 216859
dataset_size: 839196
---
# Dataset Card for "TinyImagenet_2k_validation_google_flan_t5_xxl_mode_T_SPECIFIC_A_ns_2000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MatsRooth/have_one | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: label
dtype:
class_label:
names:
'0': I+have+one+now
'1': I+only+have+one
splits:
- name: train
num_bytes: 10168367.5
num_examples: 535
- name: test
num_bytes: 1499291.5
num_examples: 95
- name: validation
num_bytes: 1720511.5
num_examples: 97
download_size: 13330229
dataset_size: 13388170.5
---
# Dataset Card for "have_one"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
PleIAs/US-PD-Newspapers | ---
license: cc0-1.0
task_categories:
- text-generation
language:
- en
tags:
- ocr
pretty_name: United States-Public Domain-Newspapers
---
# 🇺🇸 US Public Domain Newspapers 🇺🇸
**US-PD-Newspapers** is an aggregation of all the archives of US newspapers digitized by the Library of Congress for the Chronicling America digital library.
With nearly 100 billion words, it is one of the largest open corpora in the United States. All the materials are now part of the public domain and have no remaining intellectual property rights.
## Content
As of January 2024, the collection contains nearly 21 million unique newspaper and periodical editions published from 1690 to 1963 (98,742,987,471 words).
The collection was compiled by Pierre-Carl Langlais based on the [dumps](https://chroniclingamerica.loc.gov/data/ocr/) made available by the Library of Congress. Each parquet file matches one of the 2618 original dump files, including their code name. It has the full text of a few thousand editions selected at random and a few core metadata fields (edition id, date, word counts…). The metadata can easily be expanded thanks to the LOC APIs and other data services.
The [American Stories dataset](https://huggingface.co/datasets/dell-research-harvard/AmericanStories) is a curated and enhanced version of the same resource, with significant progress with regard to text quality and documentation. It currently retains about 20% of the original material.
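A minimal loading sketch, assuming the default parquet configuration exposes a "train" split with a "text" column (streaming avoids downloading the full corpus up front):
```python
from datasets import load_dataset

# Stream the corpus instead of downloading every parquet file first.
news = load_dataset("PleIAs/US-PD-Newspapers", split="train", streaming=True)

for i, edition in enumerate(news):
    print(edition.get("text", "")[:200])  # first characters of the OCR text
    if i == 2:
        break
```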
## Language
While most of the collection is in English, it also covers a wider variety of European languages, especially German (600k editions) and Spanish (400k editions).
## Uses
The primary use of the collection is for cultural analytics on a wide scale. It has been instrumental in some major digital humanities projects like [Viral Texts](https://viraltexts.org/).
The collection also aims to expand the availability of open works for the training of Large Language Models. The text can be used for model training and republished without restriction for reproducibility purposes.
## License
The composition of the dataset adheres to the US criteria for public domain (any publication without a copyright removal). In agreement with the shorter term rules, the dataset is in the public domain for all countries with a Berne author-right model.
The Library of Congress does not claim any additional rights: "As a publicly supported institution, we generally do not own the rights to materials in our collections. You should determine for yourself whether or not an item is protected by copyright or in the public domain, and then satisfy any copyright or use restrictions when publishing or distributing materials from our collections."
## Future developments
This dataset is not a one-time effort and will continue to evolve significantly in several directions:
* Correction of computer-generated errors in the text. All the texts have been transcribed automatically through the use of Optical Character Recognition (OCR) software. The original files have been digitized over a long time period (since the mid-2000s).
* Enhancement of the structure/editorial presentation of the original text. Some parts of the original documents are likely unwanted for large-scale analysis or model training (headers, page counts…). Additionally, some advanced document structures like tables or multi-column layouts are unlikely to be well formatted. Major enhancements can be expected from applying new SOTA layout recognition models to the original PDF files.
* Expansion of the collection to other cultural heritage holdings, especially from Hathi Trust, Internet Archive and Google Books.
The American Stories dataset already includes some of these features (especially better OCR and article-level segmentation) and may be a preferable solution if text quality is a concern.
## Acknowledgements
The corpus was stored and processed with the generous support of [OpenLLM France](https://www.openllm-france.fr/) and Scaleway. It was built up with the support and concerted efforts of the state start-up LANGU:IA (start-up d’Etat), supported by the French Ministry of Culture and DINUM, as part of the prefiguration of the service offering of the Alliance for Language technologies EDIC (ALT-EDIC).
Corpus collection has been greatly facilitated by the insights and cooperation of the open science LLM community (Occiglot, Eleuther AI, Allen AI).
<div style="text-align: center;">
<img src="https://github.com/mch-dd/datasetlogo/blob/main/scaleway.jpeg?raw=true" style="width: 33%; margin: 0 auto; display: inline-block;"/>
<img src="https://github.com/mch-dd/datasetlogo/blob/main/ministere.png?raw=true" style="width: 33%; margin: 0 auto; display: inline-block;"/>
<img src="https://github.com/mch-dd/datasetlogo/blob/main/occiglot.jpg?raw=true" style="width: 33%; margin: 0 auto; display: inline-block;"/>
</div>
|
sam-mosaic/iv4-no-fan | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 2284716246.0
num_examples: 313449
- name: test
num_bytes: 316832414.0
num_examples: 36655
download_size: 1312236484
dataset_size: 2601548660.0
---
# Dataset Card for "iv4-no-fan"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
WillHeld/wmt19-valid-only-gu_en | ---
dataset_info:
features:
- name: translation
dtype:
translation:
languages:
- gu
- en
splits:
- name: validation
num_bytes: 774621
num_examples: 1998
download_size: 367288
dataset_size: 774621
---
# Dataset Card for "wmt19-valid-only-gu_en"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_stsb_aint_be | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 13296
num_examples: 64
- name: test
num_bytes: 7581
num_examples: 56
- name: train
num_bytes: 17217
num_examples: 105
download_size: 33788
dataset_size: 38094
---
# Dataset Card for "MULTI_VALUE_stsb_aint_be"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_stsb_em_obj_pronoun | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 9534
num_examples: 47
- name: test
num_bytes: 9464
num_examples: 66
- name: train
num_bytes: 13110
num_examples: 75
download_size: 29885
dataset_size: 32108
---
# Dataset Card for "MULTI_VALUE_stsb_em_obj_pronoun"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/Caltech101_with_background_test_google_flan_t5_xxl_mode_A_ns_100 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_ViT_L_14_Attributes_ViT_L_14_text_davinci_003_full_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 40109
num_examples: 100
download_size: 11527
dataset_size: 40109
---
# Dataset Card for "Caltech101_with_background_test_google_flan_t5_xxl_mode_A_ns_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
thomasavare/waste-classification-audio-helsinki2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: audio
dtype: audio
- name: speaker
dtype: string
- name: transcription
dtype: string
- name: translation
dtype: string
- name: Class
dtype: string
- name: Class_index
dtype: float64
splits:
- name: train
num_bytes: 190035689.0
num_examples: 500
download_size: 190018067
dataset_size: 190035689.0
---
# Dataset Card for "waste-classification-audio-helsinki2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
trtd56/practical_nlp_course_4_dset | ---
dataset_info:
features:
- name: qid
dtype: string
- name: question
dtype: string
- name: answer_candidates
sequence: string
- name: correct_answer_index
dtype: int64
- name: input_ids
sequence:
sequence: int32
- name: token_type_ids
sequence:
sequence: int8
- name: attention_mask
sequence:
sequence: int8
splits:
- name: train
num_bytes: 80377569
num_examples: 13061
- name: validation
num_bytes: 6438977
num_examples: 995
- name: test
num_bytes: 6443893
num_examples: 997
download_size: 16067303
dataset_size: 93260439
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
BangumiBase/princesstutu | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Princess Tutu
This is the image base of bangumi Princess Tutu, we detected 23 characters, 2179 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 190 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 536 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 67 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 21 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 288 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 20 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 19 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 23 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 22 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 250 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 352 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 27 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 23 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 35 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 22 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 19 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 38 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 13 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 10 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 16 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 67 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 14 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 107 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
Zahidhasancodedev/LunaChat-v-0 | ---
license: apache-2.0
task_categories:
- question-answering
language:
- en
tags:
- not-for-all-audiences
size_categories:
- n<1K
--- |
hk-kaden-kim/uzh-hs23-etsp-eval-multi-base-line | ---
dataset_info:
features:
- name: image
dtype: image
- name: caption
dtype: string
splits:
- name: test
num_bytes: 5404740.0
num_examples: 100
download_size: 5387322
dataset_size: 5404740.0
---
# Dataset Card for "uzh-hs23-etsp-eval-multi-base-line"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
man4j/aisha_v3_safety | ---
dataset_info:
features:
- name: instruct
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: topic
dtype: string
splits:
- name: train
num_bytes: 128030
num_examples: 100
download_size: 16479
dataset_size: 128030
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yicozy/dataset_study_dictionary | ---
dataset_info:
features:
- name: study_ids
sequence: string
- name: corpus
dtype: string
splits:
- name: train
num_bytes: 1120563
num_examples: 7774
download_size: 118282
dataset_size: 1120563
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "dataset_study_dictionary"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kfahn/snowflakes | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': snowflakes_blue
'1': snowflakes_white
splits:
- name: train
num_bytes: 121462186.0
num_examples: 201
download_size: 121442907
dataset_size: 121462186.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Leon-LLM/Leon-Chess-Dataset-19k-BOS | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 10517279
num_examples: 19383
download_size: 5395613
dataset_size: 10517279
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Leon-Chess-Dataset-19k-BOS"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
khoomeik/gzipscale-0.51-10M | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 37515315
num_examples: 39063
download_size: 15493545
dataset_size: 37515315
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
gabrielaltay/pmcoa | ---
dataset_info:
features:
- name: text
dtype: string
- name: pmid
dtype: string
- name: accession_id
dtype: string
- name: license
dtype: string
- name: last_updated
dtype: string
- name: retracted
dtype: string
- name: citation
dtype: string
- name: decoded_as
dtype: string
- name: journal
dtype: string
- name: year
dtype: int32
- name: doi
dtype: string
- name: oa_subset
dtype: string
splits:
- name: train
num_bytes: 206274456770
num_examples: 4935779
- name: validation
num_bytes: 4046140044
num_examples: 87794
download_size: 111297924087
dataset_size: 210320596814
---
# Dataset Card for "pmcoa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DSSGxMunich/bplan_keyword_extraction | ---
license: mit
---
# Dataset Card for Keyword Extraction
## Dataset Description
**Homepage:** [DSSGx Munich](https://sites.google.com/view/dssgx-munich-2023/startseite) organization page.
**Repository:** [GitHub](https://github.com/DSSGxMunich/land-sealing-dataset-and-analysis).
### Dataset Summary
This folder contains the exact keyword extraction and agent-based information extraction datasets.
## Dataset Structure
### Folder structure
- **exact_search**
- baunvo_keywords.csv -> appearance of BauNVO keywords in each document.
- hochwasser_keywords.csv -> appearance of hochwasser-related keywords in each document.
- **knowledge_extraction_agent**
- fh.json -> Firsthöhe (ridge height) value detected by the agent and result from fuzzy keyword search.
- gfz.json -> Geschossflächenzahl (floor area ratio) detected by the agent and result from fuzzy keyword search.
- grz.json -> Grundflächenzahl (site coverage ratio) detected by the agent and result from fuzzy keyword search.
- max_h.json -> Maximale Gebäudehöhe (maximum building height) detected by the agent and result from fuzzy keyword search.
- min_h.json -> Minimale Gebäudehöhe (minimum building height) detected by the agent and result from fuzzy keyword search.
- th.json -> Traufhöhe (eaves height) detected by the agent and result from fuzzy keyword search.
### Data Fields
- **baunvo_keywords.csv:**
- filename: name of PDF file that was extracted.
- columns baunvo-XX and 13b: one column per searched category, containing the keywords found in the document that match that category.
- **hochwasser_keywords.csv:**
- filename: name of PDF file that was extracted.
- contextualised_keyword: paragraph context in which the exact keyword appears.
- actual_keyword: actual keyword searched for.
- category: category of the hochwasser keyword (hq100, hqhaufig, hqextrem).
- All the files in **knowledge_extraction_agent** are .json files which contain the following structure:
- id: id of document extracted.
- keyword_input: fuzzy keyword input for the value extraction (context paragraph).
- keyword_agent_response: result of the agent.
- keyword_extracted_value: extracted value from agent.
- validation: validation of result.
## Dataset Creation
#### Initial Data Collection and Normalization
This is the result of the keyword extraction from the document_texts.csv file. The exact keyword extraction was done by selecting a set of relevant keywords and searching for them in the text. Meanwhile, the agent keyword extraction is the result of searching for certain keywords using fuzzy search to get the context surrounding them, and extracting relevant values with GPT.
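As a rough illustration of the exact search step, the sketch below scans each document text for category keywords and records the surrounding paragraph; the `documents` table, the keyword lists, and the output columns are illustrative assumptions, not the project's actual code.
```python
import re

import pandas as pd

# Hypothetical stand-in for document_texts.csv: one row per extracted PDF text.
documents = pd.DataFrame({
    'filename': ['plan_001.pdf'],
    'text': ['... im HQ100-Gebiet gelten besondere Auflagen ...'],
})

# Illustrative category -> keyword mapping; the real lists live in the project repository.
keywords = {'hq100': ['HQ100'], 'hqhaufig': ['HQhäufig'], 'hqextrem': ['HQextrem']}

rows = []
for _, doc in documents.iterrows():
    for category, terms in keywords.items():
        for term in terms:
            for match in re.finditer(re.escape(term), doc['text'], flags=re.IGNORECASE):
                # Keep a short context window around the exact match.
                start = max(match.start() - 200, 0)
                end = match.end() + 200
                rows.append({
                    'filename': doc['filename'],
                    'contextualised_keyword': doc['text'][start:end],
                    'actual_keyword': term,
                    'category': category,
                })

hochwasser_keywords = pd.DataFrame(rows)
print(hochwasser_keywords)
```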
## Considerations for Using the Data
### Discussion of Biases
The keyword and agent extraction results were NOT validated manually. This is why we provide the contextual paragraph for each value: the information should be double-checked by professionals.
|
one-sec-cv12/chunk_150 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 15983059536.125
num_examples: 166407
download_size: 13526195097
dataset_size: 15983059536.125
---
# Dataset Card for "chunk_150"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jondurbin/contextual-dpo-v0.1 | ---
license: cc-by-4.0
---
# Contextual DPO

## Overview
This is a dataset meant to enhance adherence to provided context (e.g., for RAG applications) and reduce hallucinations, specifically using the airoboros context-obedient question answer format.
The chosen values were generated with [airoboros](https://github.com/jondurbin/airoboros) using only the `contextual` and `counterfactual_contextual` instructors.
The rejected values were generated using [mpt-30b-instruct](https://huggingface.co/mosaicml/mpt-30b-instruct).
### Dataset format
The format for a contextual prompt is as follows:
```
BEGININPUT
BEGINCONTEXT
[key0: value0]
[key1: value1]
... other metadata ...
ENDCONTEXT
[insert your text blocks here]
ENDINPUT
[add as many other blocks, in the exact same format]
BEGININSTRUCTION
[insert your instruction(s). The model was tuned with single questions, paragraph format, lists, etc.]
ENDINSTRUCTION
```
I know it's a bit verbose and annoying, but after much trial and error, using these explicit delimiters helps the model understand where to find the responses and how to associate specific sources with them.
- `BEGININPUT` - denotes a new input block
- `BEGINCONTEXT` - denotes the block of context (metadata key/value pairs) to associate with the current input block
- `ENDCONTEXT` - denotes the end of the metadata block for the current input
- [text] - Insert whatever text you want for the input block, as many paragraphs as can fit in the context.
- `ENDINPUT` - denotes the end of the current input block
- [repeat as many input blocks in this format as you want]
- `BEGININSTRUCTION` - denotes the start of the instruction(s) (a single one or a list) to respond to for all of the input blocks above.
- [instruction(s)]
- `ENDINSTRUCTION` - denotes the end of instruction set
Here's a trivial, but important example to prove the point:
```
BEGININPUT
BEGINCONTEXT
date: 2021-01-01
url: https://web.site/123
ENDCONTEXT
In a shocking turn of events, blueberries are now green, but will be sticking with the same name.
ENDINPUT
BEGININSTRUCTION
What color are blueberries? Source?
ENDINSTRUCTION
```
And the expected response:
```
Blueberries are now green.
Source:
date: 2021-01-01
url: https://web.site/123
```
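For completeness, here is a minimal sketch (not part of the official airoboros tooling) of how such a prompt could be assembled programmatically from retrieved chunks; the chunk contents simply reuse the toy example above:
```python
# Toy retrieved chunks; in a real RAG pipeline these would come from your retriever.
chunks = [
    {
        'metadata': {'date': '2021-01-01', 'url': 'https://web.site/123'},
        'text': (
            'In a shocking turn of events, blueberries are now green, '
            'but will be sticking with the same name.'
        ),
    },
]
instruction = 'What color are blueberries? Source?'

# Wrap each chunk in the delimiters described above.
blocks = []
for chunk in chunks:
    meta = '\n'.join(f'{key}: {value}' for key, value in chunk['metadata'].items())
    blocks.append(
        f"BEGININPUT\nBEGINCONTEXT\n{meta}\nENDCONTEXT\n{chunk['text']}\nENDINPUT"
    )

prompt = '\n'.join(blocks) + f'\nBEGININSTRUCTION\n{instruction}\nENDINSTRUCTION'
print(prompt)
```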
### References in response
As shown in the example, the dataset includes many examples where source details are included in the response when the question asks for a source/citation/reference.
Why do this? Well, the R in RAG seems to be the weakest link in the chain.
Retrieval accuracy, depending on many factors including the overall dataset size, can be quite low.
This accuracy increases when retrieving more documents, but then you have the issue of actually using
the retrieved documents in prompts. If you use one prompt per document (or document chunk), you know
exactly which document the answer came from, so there's no issue. If, however, you include multiple
chunks in a single prompt, it's useful to include the specific reference chunk(s) used to generate the
response, rather than naively including references to all of the chunks included in the prompt.
For example, suppose I have two documents:
```
url: http://foo.bar/1
Strawberries are tasty.
url: http://bar.foo/2
The cat is blue.
```
If the question being asked is `What color is the cat?`, I would only expect the 2nd document to be referenced in the response, as the other link is irrelevant.
### Contribute
If you're interested in new functionality/datasets, take a look at [bagel repo](https://github.com/jondurbin/bagel) and [airoboros](https://github.com/jondurbin/airoboros) and either make a PR or open an issue with details.
To help me with the fine-tuning costs, dataset generation, etc., please use one of the following:
- https://bmc.link/jondurbin
- ETH 0xce914eAFC2fe52FdceE59565Dd92c06f776fcb11
- BTC bc1qdwuth4vlg8x37ggntlxu5cjfwgmdy5zaa7pswf |
Utshav/mini-platypus | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4123358
num_examples: 1000
download_size: 2203627
dataset_size: 4123358
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
OrdalieTech/Ordalie-FR-Reranking-benchmark | ---
dataset_info:
features:
- name: query
dtype: string
- name: positive
sequence: string
- name: negative
sequence: string
splits:
- name: test
num_bytes: 22164217
num_examples: 1961
download_size: 11999345
dataset_size: 22164217
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
slone/bak_rus_eng_2M2023_scored | ---
dataset_info:
features:
- name: idx
dtype: int64
- name: ba
dtype: string
- name: ru
dtype: string
- name: source
dtype: string
- name: cosine_sim
dtype: float64
- name: cross_encoder_sim
dtype: float64
- name: joint_sim
dtype: float64
- name: ru_len
dtype: int64
- name: en
dtype: string
- name: en_ru_sim
dtype: float64
splits:
- name: train
num_bytes: 1070778392
num_examples: 2228224
download_size: 620446960
dataset_size: 1070778392
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "bak_rus_eng_2M2023_scored"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-adversarial_qa-adversarialQA-3783aa-1711959846 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- adversarial_qa
eval_info:
task: extractive_question_answering
model: mrp/bert-finetuned-squad
metrics: []
dataset_name: adversarial_qa
dataset_config: adversarialQA
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: mrp/bert-finetuned-squad
* Dataset: adversarial_qa
* Config: adversarialQA
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@mbartolo](https://huggingface.co/mbartolo) for evaluating this model. |
CyberHarem/akishimo_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of akishimo/秋霜 (Kantai Collection)
This is the dataset of akishimo/秋霜 (Kantai Collection), containing 139 images and their tags.
The core tags of this character are `asymmetrical_hair, multicolored_hair, gradient_hair, brown_hair, short_hair, hair_ornament, leaf_hair_ornament, grey_eyes, bangs, asymmetrical_bangs, grey_hair, bow, aqua_bow, white_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 139 | 141.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akishimo_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 139 | 88.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akishimo_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 320 | 187.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akishimo_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 139 | 126.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akishimo_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 320 | 254.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akishimo_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/akishimo_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 19 |  |  |  |  |  | 1girl, short_hair_with_long_locks, solo, blush, collarbone, looking_at_viewer, simple_background, bra, small_breasts, white_background, orange_panties, cowboy_shot, navel, open_mouth, underwear_only |
| 1 | 18 |  |  |  |  |  | 1girl, long_sleeves, purple_dress, school_uniform, short_hair_with_long_locks, solo, white_shirt, aqua_bowtie, upper_body, smile, looking_at_viewer, simple_background, open_mouth, white_background, one-hour_drawing_challenge |
| 2 | 8 |  |  |  |  |  | 1girl, aqua_bowtie, grey_thighhighs, purple_dress, school_uniform, short_hair_with_long_locks, solo, white_shirt, lace-up_boots, long_sleeves, pleated_dress, smile, full_body, simple_background, standing, white_background, blue_bowtie, blush, looking_at_viewer, seamed_legwear, chibi, sitting |
| 3 | 5 |  |  |  |  |  | 1girl, adapted_costume, detached_collar, fake_animal_ears, playboy_bunny, purple_leotard, rabbit_ears, rabbit_tail, short_hair_with_long_locks, solo, strapless_leotard, wrist_cuffs, fishnet_pantyhose, seamed_legwear, smile, thighband_pantyhose, aqua_bowtie, blue_bowtie, highleg_leotard, leaning_forward, simple_background, small_breasts, white_background, full_body, grey_pantyhose, high_heels, looking_at_viewer, standing |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | short_hair_with_long_locks | solo | blush | collarbone | looking_at_viewer | simple_background | bra | small_breasts | white_background | orange_panties | cowboy_shot | navel | open_mouth | underwear_only | long_sleeves | purple_dress | school_uniform | white_shirt | aqua_bowtie | upper_body | smile | one-hour_drawing_challenge | grey_thighhighs | lace-up_boots | pleated_dress | full_body | standing | blue_bowtie | seamed_legwear | chibi | sitting | adapted_costume | detached_collar | fake_animal_ears | playboy_bunny | purple_leotard | rabbit_ears | rabbit_tail | strapless_leotard | wrist_cuffs | fishnet_pantyhose | thighband_pantyhose | highleg_leotard | leaning_forward | grey_pantyhose | high_heels |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------------------|:-------|:--------|:-------------|:--------------------|:--------------------|:------|:----------------|:-------------------|:-----------------|:--------------|:--------|:-------------|:-----------------|:---------------|:---------------|:-----------------|:--------------|:--------------|:-------------|:--------|:-----------------------------|:------------------|:----------------|:----------------|:------------|:-----------|:--------------|:-----------------|:--------|:----------|:------------------|:------------------|:-------------------|:----------------|:-----------------|:--------------|:--------------|:--------------------|:--------------|:--------------------|:----------------------|:------------------|:------------------|:-----------------|:-------------|
| 0 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 18 |  |  |  |  |  | X | X | X | | | X | X | | | X | | | | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | X | X | X | | X | X | | | X | | | | | | X | X | X | X | X | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | X | | | X | X | | X | X | | | | | | | | | | X | | X | | | | | X | X | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Gaxys/wayuu_spa_dict | ---
dataset_info:
features:
- name: translation
struct:
- name: guc
dtype: string
- name: spa
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 84870
num_examples: 2183
download_size: 55343
dataset_size: 84870
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Haziqsayyed/gpt-expressions | ---
license: afl-3.0
task_categories:
- summarization
language:
- en
pretty_name: Rule and Expressions
size_categories:
- n<1K
tags:
- code
--- |
autoevaluate/autoeval-eval-jeffdshen__neqa2_8shot-jeffdshen__neqa2_8shot-959823-1853063400 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- jeffdshen/neqa2_8shot
eval_info:
task: text_zero_shot_classification
model: inverse-scaling/opt-350m_eval
metrics: []
dataset_name: jeffdshen/neqa2_8shot
dataset_config: jeffdshen--neqa2_8shot
dataset_split: train
col_mapping:
text: prompt
classes: classes
target: answer_index
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: inverse-scaling/opt-350m_eval
* Dataset: jeffdshen/neqa2_8shot
* Config: jeffdshen--neqa2_8shot
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@jeffdshen](https://huggingface.co/jeffdshen) for evaluating this model. |
BangumiBase/shikkakumonnosaikyoukenja | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Shikkaku Mon No Saikyou Kenja
This is the image base of bangumi Shikkaku Mon no Saikyou Kenja. We detected 35 characters and 2876 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may in fact be noisy.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potentially noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 893 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 9 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 19 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 99 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 16 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 27 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 19 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 31 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 22 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 86 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 14 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 22 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 13 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 8 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 19 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 31 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 10 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 9 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 15 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 10 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 9 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 13 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 293 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 11 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 27 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 469 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 23 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 7 | [Download](27/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 28 | 6 | [Download](28/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 29 | 12 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 467 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 8 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 6 | [Download](32/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 33 | 6 | [Download](33/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| noise | 147 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
0-hero/Matter-0.1-Slim-D | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_Weyaxi__SauerkrautLM-UNA-SOLAR-Instruct-test | ---
pretty_name: Evaluation run of Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct-test
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct-test](https://huggingface.co/Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct-test)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__SauerkrautLM-UNA-SOLAR-Instruct-test\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-23T16:56:58.470467](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__SauerkrautLM-UNA-SOLAR-Instruct-test/blob/main/results_2023-12-23T16-56-58.470467.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6653838410064873,\n\
\ \"acc_stderr\": 0.031640270521971985,\n \"acc_norm\": 0.6660954003934071,\n\
\ \"acc_norm_stderr\": 0.03228645429155969,\n \"mc1\": 0.5716034271725826,\n\
\ \"mc1_stderr\": 0.017323088597314743,\n \"mc2\": 0.7180055234145617,\n\
\ \"mc2_stderr\": 0.015031705179783715\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6843003412969283,\n \"acc_stderr\": 0.013582571095815291,\n\
\ \"acc_norm\": 0.7090443686006825,\n \"acc_norm_stderr\": 0.013273077865907595\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7124078868751245,\n\
\ \"acc_stderr\": 0.004517148434180491,\n \"acc_norm\": 0.8829914359689305,\n\
\ \"acc_norm_stderr\": 0.0032077357692780416\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.0498887651569859,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.0498887651569859\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n\
\ \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n\
\ \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \
\ \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n\
\ \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.625531914893617,\n \"acc_stderr\": 0.03163910665367291,\n\
\ \"acc_norm\": 0.625531914893617,\n \"acc_norm_stderr\": 0.03163910665367291\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6482758620689655,\n \"acc_stderr\": 0.0397923663749741,\n\
\ \"acc_norm\": 0.6482758620689655,\n \"acc_norm_stderr\": 0.0397923663749741\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.49206349206349204,\n \"acc_stderr\": 0.02574806587167328,\n \"\
acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.02574806587167328\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8129032258064516,\n\
\ \"acc_stderr\": 0.022185710092252252,\n \"acc_norm\": 0.8129032258064516,\n\
\ \"acc_norm_stderr\": 0.022185710092252252\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656209,\n\
\ \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656209\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"\
acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644244,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644244\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \
\ \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7184873949579832,\n \"acc_stderr\": 0.02921354941437217,\n \
\ \"acc_norm\": 0.7184873949579832,\n \"acc_norm_stderr\": 0.02921354941437217\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590177,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590177\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5740740740740741,\n \"acc_stderr\": 0.033723432716530624,\n \"\
acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.033723432716530624\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156862,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156862\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8481012658227848,\n \"acc_stderr\": 0.023363878096632446,\n \
\ \"acc_norm\": 0.8481012658227848,\n \"acc_norm_stderr\": 0.023363878096632446\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.03492606476623791,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.03492606476623791\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077802,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077802\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8045977011494253,\n\
\ \"acc_stderr\": 0.014179171373424383,\n \"acc_norm\": 0.8045977011494253,\n\
\ \"acc_norm_stderr\": 0.014179171373424383\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.023357365785874037,\n\
\ \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.023357365785874037\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39106145251396646,\n\
\ \"acc_stderr\": 0.016320763763808383,\n \"acc_norm\": 0.39106145251396646,\n\
\ \"acc_norm_stderr\": 0.016320763763808383\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.025403832978179615,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.025403832978179615\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7808641975308642,\n \"acc_stderr\": 0.023016705640262192,\n\
\ \"acc_norm\": 0.7808641975308642,\n \"acc_norm_stderr\": 0.023016705640262192\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49282920469361147,\n\
\ \"acc_stderr\": 0.012768922739553308,\n \"acc_norm\": 0.49282920469361147,\n\
\ \"acc_norm_stderr\": 0.012768922739553308\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7426470588235294,\n \"acc_stderr\": 0.026556519470041513,\n\
\ \"acc_norm\": 0.7426470588235294,\n \"acc_norm_stderr\": 0.026556519470041513\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6830065359477124,\n \"acc_stderr\": 0.01882421951270621,\n \
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.01882421951270621\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466108,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466108\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n\
\ \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n\
\ \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5716034271725826,\n\
\ \"mc1_stderr\": 0.017323088597314743,\n \"mc2\": 0.7180055234145617,\n\
\ \"mc2_stderr\": 0.015031705179783715\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8374112075769534,\n \"acc_stderr\": 0.010370455551343338\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6467020470053071,\n \
\ \"acc_stderr\": 0.013166337192115683\n }\n}\n```"
repo_url: https://huggingface.co/Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct-test
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|arc:challenge|25_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|gsm8k|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hellaswag|10_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T16-56-58.470467.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-23T16-56-58.470467.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- '**/details_harness|winogrande|5_2023-12-23T16-56-58.470467.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-23T16-56-58.470467.parquet'
- config_name: results
data_files:
- split: 2023_12_23T16_56_58.470467
path:
- results_2023-12-23T16-56-58.470467.parquet
- split: latest
path:
- results_2023-12-23T16-56-58.470467.parquet
---
# Dataset Card for Evaluation run of Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct-test
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct-test](https://huggingface.co/Weyaxi/SauerkrautLM-UNA-SOLAR-Instruct-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__SauerkrautLM-UNA-SOLAR-Instruct-test",
"harness_winogrande_5",
split="train")
```
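The aggregated metrics of the run can be loaded the same way; a minimal sketch based on the "results" configuration listed above:
```python
from datasets import load_dataset
# the "results" configuration stores the aggregated metrics of the run;
# its "latest" split points to the most recent results file
results = load_dataset("open-llm-leaderboard/details_Weyaxi__SauerkrautLM-UNA-SOLAR-Instruct-test",
	"results",
	split="latest")
print(results[0])
```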
## Latest results
These are the [latest results from run 2023-12-23T16:56:58.470467](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__SauerkrautLM-UNA-SOLAR-Instruct-test/blob/main/results_2023-12-23T16-56-58.470467.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each task in the "results" and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6653838410064873,
"acc_stderr": 0.031640270521971985,
"acc_norm": 0.6660954003934071,
"acc_norm_stderr": 0.03228645429155969,
"mc1": 0.5716034271725826,
"mc1_stderr": 0.017323088597314743,
"mc2": 0.7180055234145617,
"mc2_stderr": 0.015031705179783715
},
"harness|arc:challenge|25": {
"acc": 0.6843003412969283,
"acc_stderr": 0.013582571095815291,
"acc_norm": 0.7090443686006825,
"acc_norm_stderr": 0.013273077865907595
},
"harness|hellaswag|10": {
"acc": 0.7124078868751245,
"acc_stderr": 0.004517148434180491,
"acc_norm": 0.8829914359689305,
"acc_norm_stderr": 0.0032077357692780416
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.44,
"acc_stderr": 0.0498887651569859,
"acc_norm": 0.44,
"acc_norm_stderr": 0.0498887651569859
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.02890159361241178,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.02890159361241178
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.625531914893617,
"acc_stderr": 0.03163910665367291,
"acc_norm": 0.625531914893617,
"acc_norm_stderr": 0.03163910665367291
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6482758620689655,
"acc_stderr": 0.0397923663749741,
"acc_norm": 0.6482758620689655,
"acc_norm_stderr": 0.0397923663749741
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.02574806587167328,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.02574806587167328
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8129032258064516,
"acc_stderr": 0.022185710092252252,
"acc_norm": 0.8129032258064516,
"acc_norm_stderr": 0.022185710092252252
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656209,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656209
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644244,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644244
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7184873949579832,
"acc_stderr": 0.02921354941437217,
"acc_norm": 0.7184873949579832,
"acc_norm_stderr": 0.02921354941437217
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.015555802713590177,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.015555802713590177
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.033723432716530624,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.033723432716530624
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156862,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156862
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8481012658227848,
"acc_stderr": 0.023363878096632446,
"acc_norm": 0.8481012658227848,
"acc_norm_stderr": 0.023363878096632446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.03492606476623791,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.03492606476623791
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077802,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077802
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8045977011494253,
"acc_stderr": 0.014179171373424383,
"acc_norm": 0.8045977011494253,
"acc_norm_stderr": 0.014179171373424383
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.023357365785874037,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.023357365785874037
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39106145251396646,
"acc_stderr": 0.016320763763808383,
"acc_norm": 0.39106145251396646,
"acc_norm_stderr": 0.016320763763808383
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.025403832978179615,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.025403832978179615
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7808641975308642,
"acc_stderr": 0.023016705640262192,
"acc_norm": 0.7808641975308642,
"acc_norm_stderr": 0.023016705640262192
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49282920469361147,
"acc_stderr": 0.012768922739553308,
"acc_norm": 0.49282920469361147,
"acc_norm_stderr": 0.012768922739553308
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7426470588235294,
"acc_stderr": 0.026556519470041513,
"acc_norm": 0.7426470588235294,
"acc_norm_stderr": 0.026556519470041513
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.01882421951270621,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.01882421951270621
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466108,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466108
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5716034271725826,
"mc1_stderr": 0.017323088597314743,
"mc2": 0.7180055234145617,
"mc2_stderr": 0.015031705179783715
},
"harness|winogrande|5": {
"acc": 0.8374112075769534,
"acc_stderr": 0.010370455551343338
},
"harness|gsm8k|5": {
"acc": 0.6467020470053071,
"acc_stderr": 0.013166337192115683
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
AdapterOcean/med_alpaca_standardized_cluster_14_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 12504821
num_examples: 23822
download_size: 6310372
dataset_size: 12504821
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_14_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KardelRuveyda/chatbotSentences-mini | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: chatbottrainsentence
dtype: string
- name: train_sentences_length
dtype: int64
splits:
- name: train
num_bytes: 139373705.28782204
num_examples: 362520
- name: validation
num_bytes: 15486351.712177973
num_examples: 40281
download_size: 96790843
dataset_size: 154860057.0
---
# Dataset Card for "chatbotSentences-mini"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bgspaditya/maroon100k | ---
license: mit
---
|
guangyil/yelp_short | ---
license: artistic-2.0
dataset_info:
features:
- name: bert_token
sequence: int64
- name: gpt2_token
sequence: int64
splits:
- name: train
num_bytes: 89488944.91780378
num_examples: 446811
- name: test
num_bytes: 89727.08219622188
num_examples: 448
download_size: 21436068
dataset_size: 89578672.0
---
|
CyberHarem/nicholas_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of nicholas/ニコラス/尼古拉斯 (Azur Lane)
This is the dataset of nicholas/ニコラス/尼古拉斯 (Azur Lane), containing 70 images and their tags.
The core tags of this character are `ahoge, long_hair, red_eyes, yellow_eyes, heterochromia, mole_under_eye, bangs, mole, blue_hair, twintails, very_long_hair, low_twintails, breasts, hat, small_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 70 | 108.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nicholas_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 70 | 56.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nicholas_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 173 | 124.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nicholas_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 70 | 93.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nicholas_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 173 | 182.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nicholas_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/nicholas_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | 1girl, bare_shoulders, looking_at_viewer, solo, blush, collarbone, white_thighhighs, dress, simple_background, long_sleeves, closed_mouth, off_shoulder, sleeves_past_wrists, white_background, wide_sleeves, animal, chick |
| 1 | 5 |  |  |  |  |  | 1girl, blue_one-piece_swimsuit, blush, double_bun, looking_at_viewer, school_swimsuit, solo, white_thighhighs, innertube, water, covered_navel, hose, ass, bare_shoulders, bucket, collarbone, flower, from_behind, hair_ribbon, holding, long_sleeves, looking_back, off_shoulder, on_back, outdoors, parted_lips, thighs, wet, white_hair |
| 2 | 11 |  |  |  |  |  | 1girl, solo, black_thighhighs, blush, looking_at_viewer, pleated_skirt, serafuku, official_alternate_costume, red_neckerchief, white_hair, hair_ornament, heart, long_sleeves, pink_sweater, valentine, envelope, love_letter, miniskirt, no_shoes, school_bag, bell, black_choker, black_sailor_collar, black_skirt, cardigan, collarbone, off_shoulder, school_desk, sitting_on_desk, uwabaki, white_shirt |
| 3 | 6 |  |  |  |  |  | 1girl, looking_at_viewer, nurse_cap, solo, blush, heart, white_apron, white_pantyhose, collared_dress, pink_dress, puffy_short_sleeves, syringe, wrist_cuffs, bandages, full_body, holding_clipboard, open_mouth, oversized_object, pink_footwear, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | looking_at_viewer | solo | blush | collarbone | white_thighhighs | dress | simple_background | long_sleeves | closed_mouth | off_shoulder | sleeves_past_wrists | white_background | wide_sleeves | animal | chick | blue_one-piece_swimsuit | double_bun | school_swimsuit | innertube | water | covered_navel | hose | ass | bucket | flower | from_behind | hair_ribbon | holding | looking_back | on_back | outdoors | parted_lips | thighs | wet | white_hair | black_thighhighs | pleated_skirt | serafuku | official_alternate_costume | red_neckerchief | hair_ornament | heart | pink_sweater | valentine | envelope | love_letter | miniskirt | no_shoes | school_bag | bell | black_choker | black_sailor_collar | black_skirt | cardigan | school_desk | sitting_on_desk | uwabaki | white_shirt | nurse_cap | white_apron | white_pantyhose | collared_dress | pink_dress | puffy_short_sleeves | syringe | wrist_cuffs | bandages | full_body | holding_clipboard | open_mouth | oversized_object | pink_footwear |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------------------|:-------|:--------|:-------------|:-------------------|:--------|:--------------------|:---------------|:---------------|:---------------|:----------------------|:-------------------|:---------------|:---------|:--------|:--------------------------|:-------------|:------------------|:------------|:--------|:----------------|:-------|:------|:---------|:---------|:--------------|:--------------|:----------|:---------------|:----------|:-----------|:--------------|:---------|:------|:-------------|:-------------------|:----------------|:-----------|:-----------------------------|:------------------|:----------------|:--------|:---------------|:------------|:-----------|:--------------|:------------|:-----------|:-------------|:-------|:---------------|:----------------------|:--------------|:-----------|:--------------|:------------------|:----------|:--------------|:------------|:--------------|:------------------|:-----------------|:-------------|:----------------------|:----------|:--------------|:-----------|:------------|:--------------------|:-------------|:-------------------|:----------------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | | | X | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 11 |  |  |  |  |  | X | | X | X | X | X | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | | X | X | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
liuyanchen1015/MULTI_VALUE_cola_never_negator | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 1324
num_examples: 16
- name: test
num_bytes: 1249
num_examples: 15
- name: train
num_bytes: 7774
num_examples: 96
download_size: 10683
dataset_size: 10347
---
# Dataset Card for "MULTI_VALUE_cola_never_negator"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |