AlekseyKorshuk/chai-chatgpt-fullserved-chatml-deduplicated
--- dataset_info: features: - name: conversation list: - name: content dtype: string - name: do_train dtype: bool - name: role dtype: string splits: - name: train num_bytes: 242557041.6757571 num_examples: 69203 download_size: 133560349 dataset_size: 242557041.6757571 --- # Dataset Card for "chai-chatgpt-fullserved-chatml-deduplicated" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
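The schema above attaches a `do_train` flag to every conversation turn. A minimal sketch of using it, assuming (the card does not say) that `do_train` marks the turns that should contribute to the training loss, e.g. the assistant replies in a ChatML conversation:

```python
# Select the turns flagged for training. The meaning of `do_train`
# (assistant turns included in the loss) is an assumption, not stated
# in the card.
def trainable_turns(conversation):
    return [turn for turn in conversation if turn["do_train"]]

example = {
    "conversation": [
        {"role": "system", "content": "You are a helpful bot.", "do_train": False},
        {"role": "user", "content": "Hi!", "do_train": False},
        {"role": "assistant", "content": "Hello!", "do_train": True},
    ]
}
print([t["role"] for t in trainable_turns(example["conversation"])])  # ['assistant']
```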
casey-martin/oa_cpp_annotate_gen
--- task_categories: - question-answering - text-classification language: - en --- ### Dataset Description This dataset, compiled by Brendan Dolan-Gavitt, contains ~100,000 C++ functions and GPT-3.5-turbo-generated summaries of the code's purpose. An example of Brendan's original prompt and GPT-3.5's summary may be found below. ``` int gg_set_focus_pos(gg_widget_t *widget, int x, int y) { return 1; } Q. What language is the above code written in? A. C/C++. Q. What is the purpose of the above code? A. This code defines a function called `gg_set_focus_pos` that takes a pointer to a `gg_widget_t` structure and two integer arguments `x` and `y`. The function returns an integer value of 1. The purpose of this function is not clear from this code snippet alone, as it does not contain any implementation details. However, based on the function name and arguments, it is likely that this function sets the focus position of a graphical user interface (GUI) widget to the specified `x` and `y` coordinates. Tags: GUI, widget, focus, position, function. ``` The dataset's announcement can be found here: https://twitter.com/moyix/status/1644355889602654210
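The prompt format above can be reconstructed as a template. This is a sketch inferred from the single example in the card; the exact template used for generation is an assumption:

```python
# Template reconstructed from the one example shown in the card
# (the exact generation prompt is an assumption).
PROMPT_TEMPLATE = (
    "{code}\n"
    "Q. What language is the above code written in?\n"
    "A. C/C++.\n"
    "Q. What is the purpose of the above code?\n"
    "A."
)

def build_prompt(cpp_function: str) -> str:
    return PROMPT_TEMPLATE.format(code=cpp_function.rstrip())

snippet = "int gg_set_focus_pos(gg_widget_t *widget, int x, int y)\n{\n  return 1;\n}"
print(build_prompt(snippet))
```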
Memin25/minidatasets
--- dataset_info: features: - name: review dtype: string - name: review_length dtype: int64 splits: - name: train num_bytes: 292505.5375 num_examples: 3586 - name: validation num_bytes: 32545.93125 num_examples: 399 download_size: 177030 dataset_size: 325051.46875 --- # Dataset Card for "minidataset" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Vinisf/greg
--- license: openrail ---
sartmis1/text2sql-wikisql-spider
--- configs: - config_name: default data_files: - split: train path: data/train-* dataset_info: features: - name: messages dtype: string - name: __index_level_0__ dtype: int64 splits: - name: train num_bytes: 11996797 num_examples: 63355 download_size: 4004638 dataset_size: 11996797 --- # Dataset Card for "text2sql-wikisql-spider" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Dampish/eliai_2.7bh
--- dataset_info: features: - name: input dtype: string - name: output dtype: string - name: instruction dtype: string - name: input_ids sequence: int32 - name: attention_mask sequence: int8 splits: - name: train num_bytes: 2528633 num_examples: 200 download_size: 700757 dataset_size: 2528633 --- # Dataset Card for "eliai_2.7bh" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
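The schema above stores pre-tokenized `input_ids` alongside an `attention_mask`. A sketch of how the two typically relate (mask is 1 for real tokens, 0 for padding); the pad id of 0 is an assumption, not stated in the card:

```python
# Build an attention mask from padded input_ids.
# PAD_ID = 0 is an assumption; the card does not state the pad token.
PAD_ID = 0

def attention_mask(input_ids):
    return [0 if tok == PAD_ID else 1 for tok in input_ids]

ids = [101, 2054, 2003, 102, 0, 0]
print(attention_mask(ids))  # [1, 1, 1, 1, 0, 0]
```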
rewoo/planner_instruction_tuning_2k
--- license: mit --- *Bootstrap 2k Planner finetuning dataset for ReWOO.* It is a mixture of "correct" HotpotQA and TriviaQA task-planning trajectories in the ReWOO framework.
adamo1139/AEZAKMI_v2_sharegpt
--- license: other license_name: other license_link: LICENSE --- I moved AEZAKMI V2 in ShareGPT format to a different repo so that it's easier to use with the HF `datasets` library.
Nlmcasef/Caseextools
--- license: bigcode-openrail-m task_categories: - text-classification - table-question-answering - feature-extraction - token-classification - question-answering language: - ae - aa - ab - af - ak - am tags: - code - finance - legal - medical - webdataset - synthetic size_categories: - 10K<n<100K ---
darkproger/librispeech_asr
--- license: cc-by-4.0 --- This dataset is a fork of [librispeech_asr](https://huggingface.co/datasets/librispeech_asr) that exposes each original split (like train-clean-100) as its own split (named `train.clean.100`, with dots instead of hyphens). This allows you to download each part separately. This fork also reports an accurate `path` for each sample.
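The renaming convention described above (hyphens become dots) can be sketched as a one-line mapping:

```python
# Map an original librispeech_asr split name to this fork's convention:
# hyphens are replaced with dots, e.g. train-clean-100 -> train.clean.100.
def fork_split_name(original: str) -> str:
    return original.replace("-", ".")

print(fork_split_name("train-clean-100"))  # train.clean.100
print(fork_split_name("test-other"))       # test.other
```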
open-llm-leaderboard/details_allknowingroger__NeuralDolphin-7B-slerp
--- pretty_name: Evaluation run of allknowingroger/NeuralDolphin-7B-slerp dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [allknowingroger/NeuralDolphin-7B-slerp](https://huggingface.co/allknowingroger/NeuralDolphin-7B-slerp)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_allknowingroger__NeuralDolphin-7B-slerp\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-04-10T19:59:48.653872](https://huggingface.co/datasets/open-llm-leaderboard/details_allknowingroger__NeuralDolphin-7B-slerp/blob/main/results_2024-04-10T19-59-48.653872.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6523407261257382,\n\ \ \"acc_stderr\": 0.0320281547439977,\n \"acc_norm\": 0.6528519880112322,\n\ \ \"acc_norm_stderr\": 0.03268247380255343,\n \"mc1\": 0.44430844553243576,\n\ \ \"mc1_stderr\": 0.01739458625074317,\n \"mc2\": 0.608533071036163,\n\ \ \"mc2_stderr\": 0.015352819851955538\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.64419795221843,\n \"acc_stderr\": 0.013990571137918762,\n\ \ \"acc_norm\": 0.6774744027303754,\n \"acc_norm_stderr\": 0.013659980894277366\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6749651463851822,\n\ \ \"acc_stderr\": 0.004674306182532136,\n \"acc_norm\": 0.8568014339772954,\n\ \ \"acc_norm_stderr\": 0.0034955936625207483\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \ \ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\ \ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\ \ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\ \ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\ \ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \ \ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\ \ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\ \ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\ \ \"acc_norm_stderr\": 0.03514697467862388\n },\n 
\"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \ \ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n\ \ \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \ \ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\ \ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\ \ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n\ \ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n\ \ \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\ \ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\ \ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\ \ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\ \ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.43915343915343913,\n \"acc_stderr\": 0.025559920550531006,\n \"\ acc_norm\": 0.43915343915343913,\n \"acc_norm_stderr\": 
0.025559920550531006\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\ \ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\ \ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \ \ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\ \ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\ \ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n\ \ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\ : 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\ \ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"\ acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121437,\n\ \ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121437\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6820512820512821,\n \"acc_stderr\": 0.023610884308927865,\n\ \ \"acc_norm\": 0.6820512820512821,\n \"acc_norm_stderr\": 0.023610884308927865\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083018,\n \ \ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083018\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887027,\n\ \ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887027\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\ acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\ acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\ acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474082,\n \"\ acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474082\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n \ \ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\ \ \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n\ \ \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\ \ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\ acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\ \ \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.8148148148148148,\n\ \ \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\ \ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\ \ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\ \ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\ \ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\ \ \"acc_stderr\": 0.02158649400128137,\n \"acc_norm\": 0.8760683760683761,\n\ \ \"acc_norm_stderr\": 0.02158649400128137\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \ \ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n\ \ \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n\ \ \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n\ \ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4134078212290503,\n\ \ \"acc_stderr\": 0.016469814928406164,\n \"acc_norm\": 
0.4134078212290503,\n\ \ \"acc_norm_stderr\": 0.016469814928406164\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\ \ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\ \ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.7009646302250804,\n\ \ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \ \ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\ : 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291463,\n \"\ acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291463\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46153846153846156,\n\ \ \"acc_stderr\": 0.01273239828619044,\n \"acc_norm\": 0.46153846153846156,\n\ \ \"acc_norm_stderr\": 0.01273239828619044\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n\ \ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.684640522875817,\n \"acc_stderr\": 0.01879808628488689,\n \ \ \"acc_norm\": 0.684640522875817,\n \"acc_norm_stderr\": 0.01879808628488689\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\ \ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\ \ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960238,\n\ \ \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960238\n\ \ },\n \"harness|hendrycksTest-sociology|5\": 
{\n \"acc\": 0.8407960199004975,\n\ \ \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n\ \ \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \ \ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\ \ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\ \ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\ \ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.44430844553243576,\n\ \ \"mc1_stderr\": 0.01739458625074317,\n \"mc2\": 0.608533071036163,\n\ \ \"mc2_stderr\": 0.015352819851955538\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8042620363062352,\n \"acc_stderr\": 0.011151145042218327\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6853677028051555,\n \ \ \"acc_stderr\": 0.012791037227336034\n }\n}\n```" repo_url: https://huggingface.co/allknowingroger/NeuralDolphin-7B-slerp leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|arc:challenge|25_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-04-10T19-59-48.653872.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|gsm8k|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_04_10T19_59_48.653872 path: - 
'**/details_harness|hellaswag|10_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T19-59-48.653872.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-management|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-04-10T19-59-48.653872.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T19-59-48.653872.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T19-59-48.653872.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-management|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T19-59-48.653872.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-04-10T19-59-48.653872.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T19-59-48.653872.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-international_law|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-management|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-marketing|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-sociology|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-virology|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T19-59-48.653872.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|truthfulqa:mc|0_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-04-10T19-59-48.653872.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_04_10T19_59_48.653872 path: - '**/details_harness|winogrande|5_2024-04-10T19-59-48.653872.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-04-10T19-59-48.653872.parquet' - config_name: results data_files: - split: 
2024_04_10T19_59_48.653872
    path:
    - results_2024-04-10T19-59-48.653872.parquet
  - split: latest
    path:
    - results_2024-04-10T19-59-48.653872.parquet
---

# Dataset Card for Evaluation run of allknowingroger/NeuralDolphin-7B-slerp

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [allknowingroger/NeuralDolphin-7B-slerp](https://huggingface.co/allknowingroger/NeuralDolphin-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_allknowingroger__NeuralDolphin-7B-slerp",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2024-04-10T19:59:48.653872](https://huggingface.co/datasets/open-llm-leaderboard/details_allknowingroger__NeuralDolphin-7B-slerp/blob/main/results_2024-04-10T19-59-48.653872.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6523407261257382, "acc_stderr": 0.0320281547439977, "acc_norm": 0.6528519880112322, "acc_norm_stderr": 0.03268247380255343, "mc1": 0.44430844553243576, "mc1_stderr": 0.01739458625074317, "mc2": 0.608533071036163, "mc2_stderr": 0.015352819851955538 }, "harness|arc:challenge|25": { "acc": 0.64419795221843, "acc_stderr": 0.013990571137918762, "acc_norm": 0.6774744027303754, "acc_norm_stderr": 0.013659980894277366 }, "harness|hellaswag|10": { "acc": 0.6749651463851822, "acc_stderr": 0.004674306182532136, "acc_norm": 0.8568014339772954, "acc_norm_stderr": 0.0034955936625207483 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.3, "acc_stderr": 0.046056618647183814, "acc_norm": 0.3, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6148148148148148, "acc_stderr": 0.04203921040156279, "acc_norm": 0.6148148148148148, "acc_norm_stderr": 0.04203921040156279 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7039473684210527, "acc_stderr": 0.03715062154998904, "acc_norm": 0.7039473684210527, "acc_norm_stderr": 0.03715062154998904 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.62, "acc_stderr": 0.048783173121456316, "acc_norm": 0.62, "acc_norm_stderr": 0.048783173121456316 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7056603773584905, "acc_stderr": 0.02804918631569525, "acc_norm": 0.7056603773584905, "acc_norm_stderr": 0.02804918631569525 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7708333333333334, "acc_stderr": 0.03514697467862388, "acc_norm": 0.7708333333333334, "acc_norm_stderr": 0.03514697467862388 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.56, "acc_stderr": 0.049888765156985884, "acc_norm": 0.56, "acc_norm_stderr": 
0.049888765156985884 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.32, "acc_stderr": 0.04688261722621504, "acc_norm": 0.32, "acc_norm_stderr": 0.04688261722621504 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6820809248554913, "acc_stderr": 0.0355068398916558, "acc_norm": 0.6820809248554913, "acc_norm_stderr": 0.0355068398916558 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.4117647058823529, "acc_stderr": 0.04897104952726366, "acc_norm": 0.4117647058823529, "acc_norm_stderr": 0.04897104952726366 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.78, "acc_stderr": 0.04163331998932261, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932261 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.574468085106383, "acc_stderr": 0.03232146916224468, "acc_norm": 0.574468085106383, "acc_norm_stderr": 0.03232146916224468 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4824561403508772, "acc_stderr": 0.04700708033551038, "acc_norm": 0.4824561403508772, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5655172413793104, "acc_stderr": 0.04130740879555497, "acc_norm": 0.5655172413793104, "acc_norm_stderr": 0.04130740879555497 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.43915343915343913, "acc_stderr": 0.025559920550531006, "acc_norm": 0.43915343915343913, "acc_norm_stderr": 0.025559920550531006 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.48412698412698413, "acc_stderr": 0.04469881854072606, "acc_norm": 0.48412698412698413, "acc_norm_stderr": 0.04469881854072606 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.37, "acc_stderr": 0.04852365870939099, "acc_norm": 0.37, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7806451612903226, "acc_stderr": 0.023540799358723295, "acc_norm": 0.7806451612903226, "acc_norm_stderr": 0.023540799358723295 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.49261083743842365, "acc_stderr": 0.035176035403610084, "acc_norm": 0.49261083743842365, "acc_norm_stderr": 0.035176035403610084 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7818181818181819, "acc_stderr": 0.03225078108306289, "acc_norm": 0.7818181818181819, "acc_norm_stderr": 0.03225078108306289 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8131313131313131, "acc_stderr": 0.027772533334218967, "acc_norm": 0.8131313131313131, "acc_norm_stderr": 0.027772533334218967 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.8808290155440415, "acc_stderr": 0.023381935348121437, "acc_norm": 0.8808290155440415, "acc_norm_stderr": 0.023381935348121437 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6820512820512821, "acc_stderr": 0.023610884308927865, "acc_norm": 0.6820512820512821, "acc_norm_stderr": 0.023610884308927865 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.35185185185185186, "acc_stderr": 0.029116617606083018, "acc_norm": 0.35185185185185186, "acc_norm_stderr": 0.029116617606083018 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6932773109243697, "acc_stderr": 0.029953823891887027, "acc_norm": 0.6932773109243697, "acc_norm_stderr": 0.029953823891887027 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3443708609271523, "acc_stderr": 0.038796870240733264, "acc_norm": 0.3443708609271523, "acc_norm_stderr": 0.038796870240733264 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8440366972477065, "acc_stderr": 0.01555580271359017, "acc_norm": 0.8440366972477065, "acc_norm_stderr": 0.01555580271359017 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5138888888888888, "acc_stderr": 
0.03408655867977749, "acc_norm": 0.5138888888888888, "acc_norm_stderr": 0.03408655867977749 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8186274509803921, "acc_stderr": 0.027044621719474082, "acc_norm": 0.8186274509803921, "acc_norm_stderr": 0.027044621719474082 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7932489451476793, "acc_stderr": 0.026361651668389094, "acc_norm": 0.7932489451476793, "acc_norm_stderr": 0.026361651668389094 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6681614349775785, "acc_stderr": 0.03160295143776679, "acc_norm": 0.6681614349775785, "acc_norm_stderr": 0.03160295143776679 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7862595419847328, "acc_stderr": 0.0359546161177469, "acc_norm": 0.7862595419847328, "acc_norm_stderr": 0.0359546161177469 }, "harness|hendrycksTest-international_law|5": { "acc": 0.7851239669421488, "acc_stderr": 0.037494924487096966, "acc_norm": 0.7851239669421488, "acc_norm_stderr": 0.037494924487096966 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8148148148148148, "acc_stderr": 0.03755265865037182, "acc_norm": 0.8148148148148148, "acc_norm_stderr": 0.03755265865037182 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7730061349693251, "acc_stderr": 0.03291099578615769, "acc_norm": 0.7730061349693251, "acc_norm_stderr": 0.03291099578615769 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.4642857142857143, "acc_stderr": 0.04733667890053756, "acc_norm": 0.4642857142857143, "acc_norm_stderr": 0.04733667890053756 }, "harness|hendrycksTest-management|5": { "acc": 0.7864077669902912, "acc_stderr": 0.040580420156460344, "acc_norm": 0.7864077669902912, "acc_norm_stderr": 0.040580420156460344 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8760683760683761, "acc_stderr": 0.02158649400128137, "acc_norm": 0.8760683760683761, "acc_norm_stderr": 0.02158649400128137 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.76, "acc_stderr": 
0.04292346959909282, "acc_norm": 0.76, "acc_norm_stderr": 0.04292346959909282 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8314176245210728, "acc_stderr": 0.013387895731543604, "acc_norm": 0.8314176245210728, "acc_norm_stderr": 0.013387895731543604 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7312138728323699, "acc_stderr": 0.023868003262500104, "acc_norm": 0.7312138728323699, "acc_norm_stderr": 0.023868003262500104 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.4134078212290503, "acc_stderr": 0.016469814928406164, "acc_norm": 0.4134078212290503, "acc_norm_stderr": 0.016469814928406164 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.738562091503268, "acc_stderr": 0.025160998214292456, "acc_norm": 0.738562091503268, "acc_norm_stderr": 0.025160998214292456 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7009646302250804, "acc_stderr": 0.026003301117885135, "acc_norm": 0.7009646302250804, "acc_norm_stderr": 0.026003301117885135 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.75, "acc_stderr": 0.02409347123262133, "acc_norm": 0.75, "acc_norm_stderr": 0.02409347123262133 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.45390070921985815, "acc_stderr": 0.029700453247291463, "acc_norm": 0.45390070921985815, "acc_norm_stderr": 0.029700453247291463 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46153846153846156, "acc_stderr": 0.01273239828619044, "acc_norm": 0.46153846153846156, "acc_norm_stderr": 0.01273239828619044 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6727941176470589, "acc_stderr": 0.028501452860396556, "acc_norm": 0.6727941176470589, "acc_norm_stderr": 0.028501452860396556 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.684640522875817, "acc_stderr": 0.01879808628488689, "acc_norm": 0.684640522875817, "acc_norm_stderr": 0.01879808628488689 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6818181818181818, "acc_stderr": 0.04461272175910509, 
"acc_norm": 0.6818181818181818, "acc_norm_stderr": 0.04461272175910509 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7510204081632653, "acc_stderr": 0.027682979522960238, "acc_norm": 0.7510204081632653, "acc_norm_stderr": 0.027682979522960238 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8407960199004975, "acc_stderr": 0.02587064676616914, "acc_norm": 0.8407960199004975, "acc_norm_stderr": 0.02587064676616914 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.84, "acc_stderr": 0.03684529491774709, "acc_norm": 0.84, "acc_norm_stderr": 0.03684529491774709 }, "harness|hendrycksTest-virology|5": { "acc": 0.5421686746987951, "acc_stderr": 0.0387862677100236, "acc_norm": 0.5421686746987951, "acc_norm_stderr": 0.0387862677100236 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8304093567251462, "acc_stderr": 0.02878210810540171, "acc_norm": 0.8304093567251462, "acc_norm_stderr": 0.02878210810540171 }, "harness|truthfulqa:mc|0": { "mc1": 0.44430844553243576, "mc1_stderr": 0.01739458625074317, "mc2": 0.608533071036163, "mc2_stderr": 0.015352819851955538 }, "harness|winogrande|5": { "acc": 0.8042620363062352, "acc_stderr": 0.011151145042218327 }, "harness|gsm8k|5": { "acc": 0.6853677028051555, "acc_stderr": 0.012791037227336034 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
--> ### Direct Use <!-- This section describes suitable use cases for the dataset. --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
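The aggregated scores listed under "Latest results" above are plain JSON and can be post-processed offline with the standard library. As an illustrative sketch (the excerpt below is a hand-copied subset of the scores above, not a download), averaging accuracy over the MMLU (`hendrycksTest`) subtasks:

```python
import json

# Hand-copied excerpt of the "Latest results" JSON above; the full
# results file has one entry per harness task.
results = json.loads("""
{
  "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.3},
  "harness|hendrycksTest-anatomy|5": {"acc": 0.6148148148148148},
  "harness|hendrycksTest-astronomy|5": {"acc": 0.7039473684210527}
}
""")

# Average accuracy over the MMLU (hendrycksTest) subtasks in the excerpt.
mmlu_scores = [v["acc"] for k, v in results.items()
               if k.startswith("harness|hendrycksTest")]
mean_acc = sum(mmlu_scores) / len(mmlu_scores)
print(round(mean_acc, 4))  # 0.5396 for this three-task excerpt
```

The same pattern extends to the `arc`, `hellaswag`, and other harness keys; only the key-prefix filter changes.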
DynamicSuperbPrivate/SpeakerVerification_Voxceleb1Train
---
dataset_info:
  features:
  - name: file
    dtype: string
  - name: audio
    dtype: audio
  - name: file2
    dtype: string
  - name: instruction
    dtype: string
  - name: label
    dtype: string
  splits:
  - name: train
    num_bytes: 3189320201.0
    num_examples: 12000
  - name: validation
    num_bytes: 734115645.0
    num_examples: 2609
  download_size: 3908622443
  dataset_size: 3923435846.0
---
# Dataset Card for "SpeakerVerification_VoxCeleb1Train"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
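A note on reading the size fields in the `dataset_info` block above: `dataset_size` is the sum of the per-split `num_bytes` (the uncompressed, generated dataset), while `download_size` is the size of the files actually fetched (typically compressed, hence smaller). A minimal sanity-check sketch using the byte counts from this card:

```python
# Per-split uncompressed sizes (num_bytes) as listed in dataset_info above.
split_num_bytes = {
    "train": 3189320201.0,       # 12000 examples
    "validation": 734115645.0,   # 2609 examples
}

# dataset_size in the card is the sum of the split sizes.
dataset_size = sum(split_num_bytes.values())
print(dataset_size)  # 3923435846.0, matching the card's dataset_size field
```

This also explains why `download_size` (3908622443) differs from `dataset_size` (3923435846.0): the two measure different representations of the same data.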
open-llm-leaderboard/details_Open-Orca__Mixtral-SlimOrca-8x7B
--- pretty_name: Evaluation run of Open-Orca/Mixtral-SlimOrca-8x7B dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Open-Orca/Mixtral-SlimOrca-8x7B](https://huggingface.co/Open-Orca/Mixtral-SlimOrca-8x7B)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Open-Orca__Mixtral-SlimOrca-8x7B\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-12-14T10:54:31.511638](https://huggingface.co/datasets/open-llm-leaderboard/details_Open-Orca__Mixtral-SlimOrca-8x7B/blob/main/results_2023-12-14T10-54-31.511638.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6772838776728083,\n\ \ \"acc_stderr\": 0.031217522187270183,\n \"acc_norm\": 0.6826319719917908,\n\ \ \"acc_norm_stderr\": 0.03182881958823988,\n \"mc1\": 0.3818849449204406,\n\ \ \"mc1_stderr\": 0.017008101939163498,\n \"mc2\": 0.5498097510513819,\n\ \ \"mc2_stderr\": 0.015613516175450912\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6390784982935154,\n \"acc_stderr\": 0.014034761386175452,\n\ \ \"acc_norm\": 0.6766211604095563,\n \"acc_norm_stderr\": 0.013669421630012144\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6669986058554073,\n\ \ \"acc_stderr\": 0.004703238534045805,\n \"acc_norm\": 0.8511252738498307,\n\ \ \"acc_norm_stderr\": 0.003552374531305199\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \ \ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\ \ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\ \ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.7960526315789473,\n \"acc_stderr\": 0.032790004063100495,\n\ \ \"acc_norm\": 0.7960526315789473,\n \"acc_norm_stderr\": 0.032790004063100495\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n\ \ \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \ \ \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.769811320754717,\n \"acc_stderr\": 0.025907897122408163,\n\ \ \"acc_norm\": 0.769811320754717,\n \"acc_norm_stderr\": 0.025907897122408163\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\ \ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\ \ \"acc_norm_stderr\": 0.03514697467862388\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \ \ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n\ \ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.45,\n \"acc_stderr\": 0.04999999999999999,\n \ \ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.04999999999999999\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\ \ \"acc_stderr\": 0.03550683989165579,\n \"acc_norm\": 0.6820809248554913,\n\ \ \"acc_norm_stderr\": 0.03550683989165579\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\ \ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.78,\n \"acc_stderr\": 0.04163331998932264,\n \"acc_norm\": 0.78,\n\ \ \"acc_norm_stderr\": 0.04163331998932264\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.6510638297872341,\n \"acc_stderr\": 0.03115852213135779,\n\ \ \"acc_norm\": 0.6510638297872341,\n \"acc_norm_stderr\": 0.03115852213135779\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5877192982456141,\n\ \ \"acc_stderr\": 0.04630653203366596,\n \"acc_norm\": 0.5877192982456141,\n\ \ \"acc_norm_stderr\": 0.04630653203366596\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\ \ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.47883597883597884,\n \"acc_stderr\": 0.025728230952130726,\n \"\ acc_norm\": 0.47883597883597884,\n 
\"acc_norm_stderr\": 0.025728230952130726\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5317460317460317,\n\ \ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.5317460317460317,\n\ \ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \ \ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.8096774193548387,\n \"acc_stderr\": 0.022331707611823078,\n \"\ acc_norm\": 0.8096774193548387,\n \"acc_norm_stderr\": 0.022331707611823078\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.5320197044334976,\n \"acc_stderr\": 0.03510766597959217,\n \"\ acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.03510766597959217\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\ : 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047709,\n\ \ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047709\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.8333333333333334,\n \"acc_stderr\": 0.026552207828215286,\n \"\ acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026552207828215286\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.9222797927461139,\n \"acc_stderr\": 0.019321805557223144,\n\ \ \"acc_norm\": 0.9222797927461139,\n \"acc_norm_stderr\": 0.019321805557223144\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \ \ \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608463,\n \ \ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608463\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.7394957983193278,\n \"acc_stderr\": 0.028510251512341947,\n\ \ \"acc_norm\": 0.7394957983193278,\n \"acc_norm_stderr\": 0.028510251512341947\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\ acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8550458715596331,\n \"acc_stderr\": 0.015094215699700469,\n \"\ acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.015094215699700469\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5231481481481481,\n \"acc_stderr\": 0.034063153607115086,\n \"\ acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.034063153607115086\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"\ acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.869198312236287,\n \"acc_stderr\": 0.021948766059470756,\n \ \ \"acc_norm\": 0.869198312236287,\n \"acc_norm_stderr\": 0.021948766059470756\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7443946188340808,\n\ \ \"acc_stderr\": 0.029275891003969923,\n \"acc_norm\": 0.7443946188340808,\n\ \ \"acc_norm_stderr\": 0.029275891003969923\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159464,\n\ \ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159464\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.8677685950413223,\n \"acc_stderr\": 0.0309227883204458,\n \"acc_norm\"\ : 0.8677685950413223,\n \"acc_norm_stderr\": 0.0309227883204458\n },\n\ \ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\ \ \"acc_stderr\": 0.03826076324884866,\n \"acc_norm\": 0.8055555555555556,\n\ \ \"acc_norm_stderr\": 0.03826076324884866\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.03192193448934723,\n\ \ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.03192193448934723\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n\ \ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n\ \ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.03675668832233188,\n\ \ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.03675668832233188\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.905982905982906,\n\ \ \"acc_stderr\": 0.01911989279892498,\n \"acc_norm\": 0.905982905982906,\n\ \ \"acc_norm_stderr\": 0.01911989279892498\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \ \ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8671775223499362,\n\ \ \"acc_stderr\": 0.012136303209884564,\n \"acc_norm\": 0.8671775223499362,\n\ \ \"acc_norm_stderr\": 0.012136303209884564\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7745664739884393,\n \"acc_stderr\": 0.02249723019096755,\n\ \ \"acc_norm\": 0.7745664739884393,\n \"acc_norm_stderr\": 0.02249723019096755\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41787709497206704,\n\ \ \"acc_stderr\": 0.016495400635820084,\n \"acc_norm\": 
0.41787709497206704,\n\ \ \"acc_norm_stderr\": 0.016495400635820084\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\ \ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7556270096463023,\n\ \ \"acc_stderr\": 0.024406162094668886,\n \"acc_norm\": 0.7556270096463023,\n\ \ \"acc_norm_stderr\": 0.024406162094668886\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7932098765432098,\n \"acc_stderr\": 0.022535006705942835,\n\ \ \"acc_norm\": 0.7932098765432098,\n \"acc_norm_stderr\": 0.022535006705942835\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.5141843971631206,\n \"acc_stderr\": 0.02981549448368206,\n \ \ \"acc_norm\": 0.5141843971631206,\n \"acc_norm_stderr\": 0.02981549448368206\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5052151238591917,\n\ \ \"acc_stderr\": 0.012769541449652547,\n \"acc_norm\": 0.5052151238591917,\n\ \ \"acc_norm_stderr\": 0.012769541449652547\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.0279715413701706,\n\ \ \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.0279715413701706\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.7107843137254902,\n \"acc_stderr\": 0.018342529845275908,\n \ \ \"acc_norm\": 0.7107843137254902,\n \"acc_norm_stderr\": 0.018342529845275908\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\ \ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\ \ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.02752963744017492,\n\ \ \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.02752963744017492\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n\ \ \"acc_stderr\": 0.022509345325101706,\n \"acc_norm\": 0.8855721393034826,\n\ \ \"acc_norm_stderr\": 0.022509345325101706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \ \ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\ \ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\ \ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.025679342723276915,\n\ \ \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.025679342723276915\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3818849449204406,\n\ \ \"mc1_stderr\": 0.017008101939163498,\n \"mc2\": 0.5498097510513819,\n\ \ \"mc2_stderr\": 0.015613516175450912\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8050513022888713,\n \"acc_stderr\": 0.011134099415938278\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.45564821834723274,\n \ \ \"acc_stderr\": 0.013718194542485601\n }\n}\n```" repo_url: https://huggingface.co/Open-Orca/Mixtral-SlimOrca-8x7B leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|arc:challenge|25_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-12-14T10-54-31.511638.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|gsm8k|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_12_14T10_54_31.511638 
path: - '**/details_harness|hellaswag|10_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-14T10-54-31.511638.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-management|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-12-14T10-54-31.511638.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-14T10-54-31.511638.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-14T10-54-31.511638.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-management|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-12-14T10-54-31.511638.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-12-14T10-54-31.511638.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-14T10-54-31.511638.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-international_law|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-management|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-marketing|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-sociology|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-virology|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-12-14T10-54-31.511638.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|truthfulqa:mc|0_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-12-14T10-54-31.511638.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_12_14T10_54_31.511638 path: - '**/details_harness|winogrande|5_2023-12-14T10-54-31.511638.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-12-14T10-54-31.511638.parquet' - config_name: results data_files: - split: 
2023_12_14T10_54_31.511638 path: - results_2023-12-14T10-54-31.511638.parquet - split: latest path: - results_2023-12-14T10-54-31.511638.parquet
---

# Dataset Card for Evaluation run of Open-Orca/Mixtral-SlimOrca-8x7B

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [Open-Orca/Mixtral-SlimOrca-8x7B](https://huggingface.co/Open-Orca/Mixtral-SlimOrca-8x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Open-Orca__Mixtral-SlimOrca-8x7B",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2023-12-14T10:54:31.511638](https://huggingface.co/datasets/open-llm-leaderboard/details_Open-Orca__Mixtral-SlimOrca-8x7B/blob/main/results_2023-12-14T10-54-31.511638.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6772838776728083, "acc_stderr": 0.031217522187270183, "acc_norm": 0.6826319719917908, "acc_norm_stderr": 0.03182881958823988, "mc1": 0.3818849449204406, "mc1_stderr": 0.017008101939163498, "mc2": 0.5498097510513819, "mc2_stderr": 0.015613516175450912 }, "harness|arc:challenge|25": { "acc": 0.6390784982935154, "acc_stderr": 0.014034761386175452, "acc_norm": 0.6766211604095563, "acc_norm_stderr": 0.013669421630012144 }, "harness|hellaswag|10": { "acc": 0.6669986058554073, "acc_stderr": 0.004703238534045805, "acc_norm": 0.8511252738498307, "acc_norm_stderr": 0.003552374531305199 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6148148148148148, "acc_stderr": 0.04203921040156279, "acc_norm": 0.6148148148148148, "acc_norm_stderr": 0.04203921040156279 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.7960526315789473, "acc_stderr": 0.032790004063100495, "acc_norm": 0.7960526315789473, "acc_norm_stderr": 0.032790004063100495 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.66, "acc_stderr": 0.04760952285695238, "acc_norm": 0.66, "acc_norm_stderr": 0.04760952285695238 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.769811320754717, "acc_stderr": 0.025907897122408163, "acc_norm": 0.769811320754717, "acc_norm_stderr": 0.025907897122408163 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7708333333333334, "acc_stderr": 0.03514697467862388, "acc_norm": 0.7708333333333334, "acc_norm_stderr": 0.03514697467862388 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.6, "acc_stderr": 0.04923659639173309, "acc_norm": 0.6, "acc_norm_stderr": 
0.04923659639173309 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.45, "acc_stderr": 0.04999999999999999, "acc_norm": 0.45, "acc_norm_stderr": 0.04999999999999999 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6820809248554913, "acc_stderr": 0.03550683989165579, "acc_norm": 0.6820809248554913, "acc_norm_stderr": 0.03550683989165579 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.38235294117647056, "acc_stderr": 0.04835503696107224, "acc_norm": 0.38235294117647056, "acc_norm_stderr": 0.04835503696107224 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.78, "acc_stderr": 0.04163331998932264, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932264 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6510638297872341, "acc_stderr": 0.03115852213135779, "acc_norm": 0.6510638297872341, "acc_norm_stderr": 0.03115852213135779 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.5877192982456141, "acc_stderr": 0.04630653203366596, "acc_norm": 0.5877192982456141, "acc_norm_stderr": 0.04630653203366596 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5862068965517241, "acc_stderr": 0.04104269211806232, "acc_norm": 0.5862068965517241, "acc_norm_stderr": 0.04104269211806232 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.47883597883597884, "acc_stderr": 0.025728230952130726, "acc_norm": 0.47883597883597884, "acc_norm_stderr": 0.025728230952130726 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5317460317460317, "acc_stderr": 0.04463112720677172, "acc_norm": 0.5317460317460317, "acc_norm_stderr": 0.04463112720677172 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.42, "acc_stderr": 0.049604496374885836, "acc_norm": 0.42, "acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8096774193548387, "acc_stderr": 0.022331707611823078, "acc_norm": 0.8096774193548387, "acc_norm_stderr": 0.022331707611823078 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5320197044334976, "acc_stderr": 0.03510766597959217, "acc_norm": 0.5320197044334976, "acc_norm_stderr": 0.03510766597959217 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.72, "acc_stderr": 0.04512608598542127, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542127 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.793939393939394, "acc_stderr": 0.03158415324047709, "acc_norm": 0.793939393939394, "acc_norm_stderr": 0.03158415324047709 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8333333333333334, "acc_stderr": 0.026552207828215286, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.026552207828215286 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9222797927461139, "acc_stderr": 0.019321805557223144, "acc_norm": 0.9222797927461139, "acc_norm_stderr": 0.019321805557223144 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.676923076923077, "acc_stderr": 0.02371088850197057, "acc_norm": 0.676923076923077, "acc_norm_stderr": 0.02371088850197057 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.31851851851851853, "acc_stderr": 0.028406533090608463, "acc_norm": 0.31851851851851853, "acc_norm_stderr": 0.028406533090608463 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7394957983193278, "acc_stderr": 0.028510251512341947, "acc_norm": 0.7394957983193278, "acc_norm_stderr": 0.028510251512341947 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.37748344370860926, "acc_stderr": 0.0395802723112157, "acc_norm": 0.37748344370860926, "acc_norm_stderr": 0.0395802723112157 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8550458715596331, "acc_stderr": 0.015094215699700469, "acc_norm": 0.8550458715596331, "acc_norm_stderr": 0.015094215699700469 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5231481481481481, "acc_stderr": 
0.034063153607115086, "acc_norm": 0.5231481481481481, "acc_norm_stderr": 0.034063153607115086 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8284313725490197, "acc_stderr": 0.026460569561240644, "acc_norm": 0.8284313725490197, "acc_norm_stderr": 0.026460569561240644 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.869198312236287, "acc_stderr": 0.021948766059470756, "acc_norm": 0.869198312236287, "acc_norm_stderr": 0.021948766059470756 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7443946188340808, "acc_stderr": 0.029275891003969923, "acc_norm": 0.7443946188340808, "acc_norm_stderr": 0.029275891003969923 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.7938931297709924, "acc_stderr": 0.03547771004159464, "acc_norm": 0.7938931297709924, "acc_norm_stderr": 0.03547771004159464 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8677685950413223, "acc_stderr": 0.0309227883204458, "acc_norm": 0.8677685950413223, "acc_norm_stderr": 0.0309227883204458 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8055555555555556, "acc_stderr": 0.03826076324884866, "acc_norm": 0.8055555555555556, "acc_norm_stderr": 0.03826076324884866 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7914110429447853, "acc_stderr": 0.03192193448934723, "acc_norm": 0.7914110429447853, "acc_norm_stderr": 0.03192193448934723 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5267857142857143, "acc_stderr": 0.047389751192741546, "acc_norm": 0.5267857142857143, "acc_norm_stderr": 0.047389751192741546 }, "harness|hendrycksTest-management|5": { "acc": 0.8349514563106796, "acc_stderr": 0.03675668832233188, "acc_norm": 0.8349514563106796, "acc_norm_stderr": 0.03675668832233188 }, "harness|hendrycksTest-marketing|5": { "acc": 0.905982905982906, "acc_stderr": 0.01911989279892498, "acc_norm": 0.905982905982906, "acc_norm_stderr": 0.01911989279892498 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.76, "acc_stderr": 
0.042923469599092816, "acc_norm": 0.76, "acc_norm_stderr": 0.042923469599092816 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8671775223499362, "acc_stderr": 0.012136303209884564, "acc_norm": 0.8671775223499362, "acc_norm_stderr": 0.012136303209884564 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7745664739884393, "acc_stderr": 0.02249723019096755, "acc_norm": 0.7745664739884393, "acc_norm_stderr": 0.02249723019096755 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.41787709497206704, "acc_stderr": 0.016495400635820084, "acc_norm": 0.41787709497206704, "acc_norm_stderr": 0.016495400635820084 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.738562091503268, "acc_stderr": 0.025160998214292456, "acc_norm": 0.738562091503268, "acc_norm_stderr": 0.025160998214292456 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7556270096463023, "acc_stderr": 0.024406162094668886, "acc_norm": 0.7556270096463023, "acc_norm_stderr": 0.024406162094668886 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7932098765432098, "acc_stderr": 0.022535006705942835, "acc_norm": 0.7932098765432098, "acc_norm_stderr": 0.022535006705942835 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5141843971631206, "acc_stderr": 0.02981549448368206, "acc_norm": 0.5141843971631206, "acc_norm_stderr": 0.02981549448368206 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.5052151238591917, "acc_stderr": 0.012769541449652547, "acc_norm": 0.5052151238591917, "acc_norm_stderr": 0.012769541449652547 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6948529411764706, "acc_stderr": 0.0279715413701706, "acc_norm": 0.6948529411764706, "acc_norm_stderr": 0.0279715413701706 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.7107843137254902, "acc_stderr": 0.018342529845275908, "acc_norm": 0.7107843137254902, "acc_norm_stderr": 0.018342529845275908 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6636363636363637, "acc_stderr": 
0.04525393596302505, "acc_norm": 0.6636363636363637, "acc_norm_stderr": 0.04525393596302505 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7551020408163265, "acc_stderr": 0.02752963744017492, "acc_norm": 0.7551020408163265, "acc_norm_stderr": 0.02752963744017492 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8855721393034826, "acc_stderr": 0.022509345325101706, "acc_norm": 0.8855721393034826, "acc_norm_stderr": 0.022509345325101706 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.87, "acc_stderr": 0.033799766898963086, "acc_norm": 0.87, "acc_norm_stderr": 0.033799766898963086 }, "harness|hendrycksTest-virology|5": { "acc": 0.5180722891566265, "acc_stderr": 0.03889951252827216, "acc_norm": 0.5180722891566265, "acc_norm_stderr": 0.03889951252827216 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8713450292397661, "acc_stderr": 0.025679342723276915, "acc_norm": 0.8713450292397661, "acc_norm_stderr": 0.025679342723276915 }, "harness|truthfulqa:mc|0": { "mc1": 0.3818849449204406, "mc1_stderr": 0.017008101939163498, "mc2": 0.5498097510513819, "mc2_stderr": 0.015613516175450912 }, "harness|winogrande|5": { "acc": 0.8050513022888713, "acc_stderr": 0.011134099415938278 }, "harness|gsm8k|5": { "acc": 0.45564821834723274, "acc_stderr": 0.013718194542485601 } }
```

## Dataset Details

### Dataset Description

<!-- Provide a longer summary of what this dataset is. -->

- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]

### Dataset Sources [optional]

<!-- Provide the basic links for the dataset. -->

- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]

## Uses

<!-- Address questions around how the dataset is intended to be used. -->

### Direct Use

<!-- This section describes suitable use cases for the dataset. -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->

[More Information Needed]

## Dataset Structure

<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->

[More Information Needed]

## Dataset Creation

### Curation Rationale

<!-- Motivation for the creation of this dataset. -->

[More Information Needed]

### Source Data

<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->

#### Data Collection and Processing

<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->

[More Information Needed]

#### Who are the source data producers?

<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->

[More Information Needed]

### Annotations [optional]

<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->

#### Annotation process

<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->

[More Information Needed]

#### Who are the annotators?

<!-- This section describes the people or systems who created the annotations. -->

[More Information Needed]

#### Personal and Sensitive Information

<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

## Citation [optional]

<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Dataset Card Authors [optional]

[More Information Needed]

## Dataset Card Contact

[More Information Needed]
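The card above notes that every evaluation run is stored under a timestamped split (e.g. `2023_12_14T10_54_31.511638`), with `latest` always pointing at the newest results. Because the timestamp fields are zero-padded, plain string ordering matches chronological ordering, so the newest run can be resolved without parsing dates. A minimal sketch of that idea (the first timestamp is taken from this card; the earlier one is hypothetical):

```python
def newest_run(split_names):
    """Return the most recent timestamped split name.

    The leaderboard names run splits like '2023_12_14T10_54_31.511638';
    zero-padded fields make lexicographic order equal chronological order.
    """
    runs = [s for s in split_names if s != "latest"]
    return max(runs)

splits = ["latest", "2023_10_02T09_00_00.000000", "2023_12_14T10_54_31.511638"]
print(newest_run(splits))  # 2023_12_14T10_54_31.511638
```

In practice you rarely need this, since loading with `split="latest"` already resolves to the same run, but it is handy when comparing several timestamped splits of one configuration.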
camenduru/hands
--- dataset_info: features: - name: image dtype: image - name: text dtype: string splits: - name: train num_bytes: 890694993.116 num_examples: 11076 download_size: 695555524 dataset_size: 890694993.116 ---

# Dataset Card for "hands"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
open-llm-leaderboard/details_Swisslex__Mixtral-8x7b-DPO-v0.2
--- pretty_name: Evaluation run of Swisslex/Mixtral-8x7b-DPO-v0.2 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [Swisslex/Mixtral-8x7b-DPO-v0.2](https://huggingface.co/Swisslex/Mixtral-8x7b-DPO-v0.2)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Swisslex__Mixtral-8x7b-DPO-v0.2\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-01-16T18:18:43.502951](https://huggingface.co/datasets/open-llm-leaderboard/details_Swisslex__Mixtral-8x7b-DPO-v0.2/blob/main/results_2024-01-16T18-18-43.502951.json) (note\ \ that there might be results for other tasks in the repo if successive evals didn't\ \ cover the same tasks.
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7090305152663554,\n\ \ \"acc_stderr\": 0.030408745551927793,\n \"acc_norm\": 0.7130050743913563,\n\ \ \"acc_norm_stderr\": 0.03099893935186279,\n \"mc1\": 0.4394124847001224,\n\ \ \"mc1_stderr\": 0.017374520482513707,\n \"mc2\": 0.5869091785507645,\n\ \ \"mc2_stderr\": 0.01561392560307738\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6715017064846417,\n \"acc_stderr\": 0.013724978465537293,\n\ \ \"acc_norm\": 0.7039249146757679,\n \"acc_norm_stderr\": 0.01334091608524626\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6922923720374428,\n\ \ \"acc_stderr\": 0.004606015773125625,\n \"acc_norm\": 0.8773152758414658,\n\ \ \"acc_norm_stderr\": 0.0032740447231806155\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \ \ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n\ \ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n\ \ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.8289473684210527,\n \"acc_stderr\": 0.030643607071677098,\n\ \ \"acc_norm\": 0.8289473684210527,\n \"acc_norm_stderr\": 0.030643607071677098\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n\ \ \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.71,\n \ \ \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.7622641509433963,\n \"acc_stderr\": 0.02619980880756192,\n\ \ \"acc_norm\": 0.7622641509433963,\n \"acc_norm_stderr\": 0.02619980880756192\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8402777777777778,\n\ \ \"acc_stderr\": 0.03063557897209328,\n \"acc_norm\": 0.8402777777777778,\n\ \ \"acc_norm_stderr\": 0.03063557897209328\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \ \ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\ acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\ : 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \ \ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n\ \ \"acc_stderr\": 0.035149425512674394,\n \"acc_norm\": 0.6936416184971098,\n\ \ \"acc_norm_stderr\": 0.035149425512674394\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04975185951049946,\n \ \ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04975185951049946\n },\n\ \ \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n\ \ \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \ \ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.676595744680851,\n \"acc_stderr\": 0.030579442773610337,\n\ \ \"acc_norm\": 0.676595744680851,\n \"acc_norm_stderr\": 0.030579442773610337\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6578947368421053,\n\ \ \"acc_stderr\": 0.044629175353369376,\n \"acc_norm\": 0.6578947368421053,\n\ \ \"acc_norm_stderr\": 0.044629175353369376\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.6689655172413793,\n \"acc_stderr\": 0.039215453124671215,\n\ \ \"acc_norm\": 0.6689655172413793,\n \"acc_norm_stderr\": 0.039215453124671215\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.47354497354497355,\n \"acc_stderr\": 0.025715239811346758,\n \"\ acc_norm\": 0.47354497354497355,\n \"acc_norm_stderr\": 
0.025715239811346758\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5476190476190477,\n\ \ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.5476190476190477,\n\ \ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \ \ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8225806451612904,\n\ \ \"acc_stderr\": 0.02173254068932928,\n \"acc_norm\": 0.8225806451612904,\n\ \ \"acc_norm_stderr\": 0.02173254068932928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\ : {\n \"acc\": 0.6305418719211823,\n \"acc_stderr\": 0.03395970381998575,\n\ \ \"acc_norm\": 0.6305418719211823,\n \"acc_norm_stderr\": 0.03395970381998575\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\"\ : 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656208,\n\ \ \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656208\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.8838383838383839,\n \"acc_stderr\": 0.022828881775249377,\n \"\ acc_norm\": 0.8838383838383839,\n \"acc_norm_stderr\": 0.022828881775249377\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.9222797927461139,\n \"acc_stderr\": 0.019321805557223157,\n\ \ \"acc_norm\": 0.9222797927461139,\n \"acc_norm_stderr\": 0.019321805557223157\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.7051282051282052,\n \"acc_stderr\": 0.023119362758232297,\n\ \ \"acc_norm\": 0.7051282051282052,\n \"acc_norm_stderr\": 0.023119362758232297\n\ \ },\n \"harness|hendrycksTest-high_school_mathematics|5\": 
{\n \"\ acc\": 0.36666666666666664,\n \"acc_stderr\": 0.02938162072646507,\n \ \ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.02938162072646507\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.026265024608275882,\n\ \ \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.026265024608275882\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"\ acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8844036697247707,\n \"acc_stderr\": 0.01370874953417264,\n \"\ acc_norm\": 0.8844036697247707,\n \"acc_norm_stderr\": 0.01370874953417264\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5972222222222222,\n \"acc_stderr\": 0.03344887382997865,\n \"\ acc_norm\": 0.5972222222222222,\n \"acc_norm_stderr\": 0.03344887382997865\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8578431372549019,\n \"acc_stderr\": 0.024509803921568624,\n \"\ acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.024509803921568624\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.8818565400843882,\n \"acc_stderr\": 0.021011052659878456,\n \ \ \"acc_norm\": 0.8818565400843882,\n \"acc_norm_stderr\": 0.021011052659878456\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7533632286995515,\n\ \ \"acc_stderr\": 0.028930413120910877,\n \"acc_norm\": 0.7533632286995515,\n\ \ \"acc_norm_stderr\": 0.028930413120910877\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.8244274809160306,\n \"acc_stderr\": 0.03336820338476076,\n\ \ \"acc_norm\": 0.8244274809160306,\n \"acc_norm_stderr\": 0.03336820338476076\n\ \ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.8347107438016529,\n 
\"acc_stderr\": 0.03390780612972776,\n \"\ acc_norm\": 0.8347107438016529,\n \"acc_norm_stderr\": 0.03390780612972776\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\ \ \"acc_stderr\": 0.036028141763926456,\n \"acc_norm\": 0.8333333333333334,\n\ \ \"acc_norm_stderr\": 0.036028141763926456\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.8282208588957055,\n \"acc_stderr\": 0.029634717272371037,\n\ \ \"acc_norm\": 0.8282208588957055,\n \"acc_norm_stderr\": 0.029634717272371037\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5446428571428571,\n\ \ \"acc_stderr\": 0.04726835553719098,\n \"acc_norm\": 0.5446428571428571,\n\ \ \"acc_norm_stderr\": 0.04726835553719098\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.034926064766237906,\n\ \ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.034926064766237906\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9188034188034188,\n\ \ \"acc_stderr\": 0.017893784904018533,\n \"acc_norm\": 0.9188034188034188,\n\ \ \"acc_norm_stderr\": 0.017893784904018533\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.8,\n \"acc_stderr\": 0.040201512610368445,\n \ \ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.040201512610368445\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.876117496807152,\n\ \ \"acc_stderr\": 0.011781017100950737,\n \"acc_norm\": 0.876117496807152,\n\ \ \"acc_norm_stderr\": 0.011781017100950737\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7861271676300579,\n \"acc_stderr\": 0.022075709251757183,\n\ \ \"acc_norm\": 0.7861271676300579,\n \"acc_norm_stderr\": 0.022075709251757183\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43687150837988825,\n\ \ \"acc_stderr\": 0.016588680864530626,\n \"acc_norm\": 0.43687150837988825,\n\ \ \"acc_norm_stderr\": 0.016588680864530626\n },\n 
\"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.024170840879340853,\n\ \ \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.024170840879340853\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7813504823151125,\n\ \ \"acc_stderr\": 0.02347558141786111,\n \"acc_norm\": 0.7813504823151125,\n\ \ \"acc_norm_stderr\": 0.02347558141786111\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.8425925925925926,\n \"acc_stderr\": 0.020263764996385714,\n\ \ \"acc_norm\": 0.8425925925925926,\n \"acc_norm_stderr\": 0.020263764996385714\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.5390070921985816,\n \"acc_stderr\": 0.02973659252642444,\n \ \ \"acc_norm\": 0.5390070921985816,\n \"acc_norm_stderr\": 0.02973659252642444\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5221642764015645,\n\ \ \"acc_stderr\": 0.012757683047716184,\n \"acc_norm\": 0.5221642764015645,\n\ \ \"acc_norm_stderr\": 0.012757683047716184\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.7830882352941176,\n \"acc_stderr\": 0.025035845227711268,\n\ \ \"acc_norm\": 0.7830882352941176,\n \"acc_norm_stderr\": 0.025035845227711268\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.7581699346405228,\n \"acc_stderr\": 0.017322789207784326,\n \ \ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.017322789207784326\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\ \ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \ \ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399687,\n\ \ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399687\n\ \ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n\ \ \"acc_stderr\": 
0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n\ \ \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \ \ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\ \ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\ \ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.024103384202072864,\n\ \ \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.024103384202072864\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4394124847001224,\n\ \ \"mc1_stderr\": 0.017374520482513707,\n \"mc2\": 0.5869091785507645,\n\ \ \"mc2_stderr\": 0.01561392560307738\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.8255722178374112,\n \"acc_stderr\": 0.01066518790249844\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5754359363153905,\n \ \ \"acc_stderr\": 0.013614835574956387\n }\n}\n```" repo_url: https://huggingface.co/Swisslex/Mixtral-8x7b-DPO-v0.2 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|arc:challenge|25_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-01-16T18-18-43.502951.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|gsm8k|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hellaswag|10_2024-01-16T18-18-43.502951.parquet' - split: latest path: - 
'**/details_harness|hellaswag|10_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T18-18-43.502951.parquet' - 
'**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T18-18-43.502951.parquet' - 
'**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T18-18-43.502951.parquet' - 
'**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T18-18-43.502951.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T18-18-43.502951.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-16T18-18-43.502951.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T18-18-43.502951.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-management|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-marketing|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-virology|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T18-18-43.502951.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|truthfulqa:mc|0_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2024-01-16T18-18-43.502951.parquet' - config_name: harness_winogrande_5 data_files: - split: 2024_01_16T18_18_43.502951 path: - '**/details_harness|winogrande|5_2024-01-16T18-18-43.502951.parquet' - split: latest path: - '**/details_harness|winogrande|5_2024-01-16T18-18-43.502951.parquet' - config_name: results data_files: - split: 
2024_01_16T18_18_43.502951 path: - results_2024-01-16T18-18-43.502951.parquet - split: latest path: - results_2024-01-16T18-18-43.502951.parquet --- # Dataset Card for Evaluation run of Swisslex/Mixtral-8x7b-DPO-v0.2 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [Swisslex/Mixtral-8x7b-DPO-v0.2](https://huggingface.co/Swisslex/Mixtral-8x7b-DPO-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_Swisslex__Mixtral-8x7b-DPO-v0.2", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2024-01-16T18:18:43.502951](https://huggingface.co/datasets/open-llm-leaderboard/details_Swisslex__Mixtral-8x7b-DPO-v0.2/blob/main/results_2024-01-16T18-18-43.502951.json)(note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.7090305152663554, "acc_stderr": 0.030408745551927793, "acc_norm": 0.7130050743913563, "acc_norm_stderr": 0.03099893935186279, "mc1": 0.4394124847001224, "mc1_stderr": 0.017374520482513707, "mc2": 0.5869091785507645, "mc2_stderr": 0.01561392560307738 }, "harness|arc:challenge|25": { "acc": 0.6715017064846417, "acc_stderr": 0.013724978465537293, "acc_norm": 0.7039249146757679, "acc_norm_stderr": 0.01334091608524626 }, "harness|hellaswag|10": { "acc": 0.6922923720374428, "acc_stderr": 0.004606015773125625, "acc_norm": 0.8773152758414658, "acc_norm_stderr": 0.0032740447231806155 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.36, "acc_stderr": 0.04824181513244218, "acc_norm": 0.36, "acc_norm_stderr": 0.04824181513244218 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6666666666666666, "acc_stderr": 0.04072314811876837, "acc_norm": 0.6666666666666666, "acc_norm_stderr": 0.04072314811876837 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.8289473684210527, "acc_stderr": 0.030643607071677098, "acc_norm": 0.8289473684210527, "acc_norm_stderr": 0.030643607071677098 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.71, "acc_stderr": 0.04560480215720683, "acc_norm": 0.71, "acc_norm_stderr": 0.04560480215720683 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7622641509433963, "acc_stderr": 0.02619980880756192, "acc_norm": 0.7622641509433963, "acc_norm_stderr": 0.02619980880756192 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.8402777777777778, "acc_stderr": 0.03063557897209328, "acc_norm": 0.8402777777777778, "acc_norm_stderr": 0.03063557897209328 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.53, "acc_stderr": 0.050161355804659205, "acc_norm": 0.53, "acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.58, "acc_stderr": 0.049604496374885836, "acc_norm": 0.58, 
"acc_norm_stderr": 0.049604496374885836 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6936416184971098, "acc_stderr": 0.035149425512674394, "acc_norm": 0.6936416184971098, "acc_norm_stderr": 0.035149425512674394 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.5, "acc_stderr": 0.04975185951049946, "acc_norm": 0.5, "acc_norm_stderr": 0.04975185951049946 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.79, "acc_stderr": 0.04093601807403326, "acc_norm": 0.79, "acc_norm_stderr": 0.04093601807403326 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.676595744680851, "acc_stderr": 0.030579442773610337, "acc_norm": 0.676595744680851, "acc_norm_stderr": 0.030579442773610337 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.6578947368421053, "acc_stderr": 0.044629175353369376, "acc_norm": 0.6578947368421053, "acc_norm_stderr": 0.044629175353369376 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.6689655172413793, "acc_stderr": 0.039215453124671215, "acc_norm": 0.6689655172413793, "acc_norm_stderr": 0.039215453124671215 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.47354497354497355, "acc_stderr": 0.025715239811346758, "acc_norm": 0.47354497354497355, "acc_norm_stderr": 0.025715239811346758 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5476190476190477, "acc_stderr": 0.044518079590553275, "acc_norm": 0.5476190476190477, "acc_norm_stderr": 0.044518079590553275 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8225806451612904, "acc_stderr": 0.02173254068932928, "acc_norm": 0.8225806451612904, "acc_norm_stderr": 0.02173254068932928 }, "harness|hendrycksTest-high_school_chemistry|5": 
{ "acc": 0.6305418719211823, "acc_stderr": 0.03395970381998575, "acc_norm": 0.6305418719211823, "acc_norm_stderr": 0.03395970381998575 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.74, "acc_stderr": 0.04408440022768078, "acc_norm": 0.74, "acc_norm_stderr": 0.04408440022768078 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.806060606060606, "acc_stderr": 0.03087414513656208, "acc_norm": 0.806060606060606, "acc_norm_stderr": 0.03087414513656208 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8838383838383839, "acc_stderr": 0.022828881775249377, "acc_norm": 0.8838383838383839, "acc_norm_stderr": 0.022828881775249377 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9222797927461139, "acc_stderr": 0.019321805557223157, "acc_norm": 0.9222797927461139, "acc_norm_stderr": 0.019321805557223157 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.7051282051282052, "acc_stderr": 0.023119362758232297, "acc_norm": 0.7051282051282052, "acc_norm_stderr": 0.023119362758232297 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.36666666666666664, "acc_stderr": 0.02938162072646507, "acc_norm": 0.36666666666666664, "acc_norm_stderr": 0.02938162072646507 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.7941176470588235, "acc_stderr": 0.026265024608275882, "acc_norm": 0.7941176470588235, "acc_norm_stderr": 0.026265024608275882 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.47019867549668876, "acc_stderr": 0.040752249922169775, "acc_norm": 0.47019867549668876, "acc_norm_stderr": 0.040752249922169775 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8844036697247707, "acc_stderr": 0.01370874953417264, "acc_norm": 0.8844036697247707, "acc_norm_stderr": 0.01370874953417264 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5972222222222222, "acc_stderr": 0.03344887382997865, "acc_norm": 0.5972222222222222, 
"acc_norm_stderr": 0.03344887382997865 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8578431372549019, "acc_stderr": 0.024509803921568624, "acc_norm": 0.8578431372549019, "acc_norm_stderr": 0.024509803921568624 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8818565400843882, "acc_stderr": 0.021011052659878456, "acc_norm": 0.8818565400843882, "acc_norm_stderr": 0.021011052659878456 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7533632286995515, "acc_stderr": 0.028930413120910877, "acc_norm": 0.7533632286995515, "acc_norm_stderr": 0.028930413120910877 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8244274809160306, "acc_stderr": 0.03336820338476076, "acc_norm": 0.8244274809160306, "acc_norm_stderr": 0.03336820338476076 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8347107438016529, "acc_stderr": 0.03390780612972776, "acc_norm": 0.8347107438016529, "acc_norm_stderr": 0.03390780612972776 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8333333333333334, "acc_stderr": 0.036028141763926456, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.036028141763926456 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.8282208588957055, "acc_stderr": 0.029634717272371037, "acc_norm": 0.8282208588957055, "acc_norm_stderr": 0.029634717272371037 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.5446428571428571, "acc_stderr": 0.04726835553719098, "acc_norm": 0.5446428571428571, "acc_norm_stderr": 0.04726835553719098 }, "harness|hendrycksTest-management|5": { "acc": 0.8543689320388349, "acc_stderr": 0.034926064766237906, "acc_norm": 0.8543689320388349, "acc_norm_stderr": 0.034926064766237906 }, "harness|hendrycksTest-marketing|5": { "acc": 0.9188034188034188, "acc_stderr": 0.017893784904018533, "acc_norm": 0.9188034188034188, "acc_norm_stderr": 0.017893784904018533 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.8, "acc_stderr": 0.040201512610368445, "acc_norm": 0.8, 
"acc_norm_stderr": 0.040201512610368445 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.876117496807152, "acc_stderr": 0.011781017100950737, "acc_norm": 0.876117496807152, "acc_norm_stderr": 0.011781017100950737 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7861271676300579, "acc_stderr": 0.022075709251757183, "acc_norm": 0.7861271676300579, "acc_norm_stderr": 0.022075709251757183 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.43687150837988825, "acc_stderr": 0.016588680864530626, "acc_norm": 0.43687150837988825, "acc_norm_stderr": 0.016588680864530626 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7679738562091504, "acc_stderr": 0.024170840879340853, "acc_norm": 0.7679738562091504, "acc_norm_stderr": 0.024170840879340853 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7813504823151125, "acc_stderr": 0.02347558141786111, "acc_norm": 0.7813504823151125, "acc_norm_stderr": 0.02347558141786111 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.8425925925925926, "acc_stderr": 0.020263764996385714, "acc_norm": 0.8425925925925926, "acc_norm_stderr": 0.020263764996385714 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5390070921985816, "acc_stderr": 0.02973659252642444, "acc_norm": 0.5390070921985816, "acc_norm_stderr": 0.02973659252642444 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.5221642764015645, "acc_stderr": 0.012757683047716184, "acc_norm": 0.5221642764015645, "acc_norm_stderr": 0.012757683047716184 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7830882352941176, "acc_stderr": 0.025035845227711268, "acc_norm": 0.7830882352941176, "acc_norm_stderr": 0.025035845227711268 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.7581699346405228, "acc_stderr": 0.017322789207784326, "acc_norm": 0.7581699346405228, "acc_norm_stderr": 0.017322789207784326 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7, "acc_stderr": 0.04389311454644287, "acc_norm": 0.7, 
"acc_norm_stderr": 0.04389311454644287 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.746938775510204, "acc_stderr": 0.027833023871399687, "acc_norm": 0.746938775510204, "acc_norm_stderr": 0.027833023871399687 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8756218905472637, "acc_stderr": 0.023335401790166327, "acc_norm": 0.8756218905472637, "acc_norm_stderr": 0.023335401790166327 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.89, "acc_stderr": 0.03144660377352203, "acc_norm": 0.89, "acc_norm_stderr": 0.03144660377352203 }, "harness|hendrycksTest-virology|5": { "acc": 0.5301204819277109, "acc_stderr": 0.03885425420866767, "acc_norm": 0.5301204819277109, "acc_norm_stderr": 0.03885425420866767 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8888888888888888, "acc_stderr": 0.024103384202072864, "acc_norm": 0.8888888888888888, "acc_norm_stderr": 0.024103384202072864 }, "harness|truthfulqa:mc|0": { "mc1": 0.4394124847001224, "mc1_stderr": 0.017374520482513707, "mc2": 0.5869091785507645, "mc2_stderr": 0.01561392560307738 }, "harness|winogrande|5": { "acc": 0.8255722178374112, "acc_stderr": 0.01066518790249844 }, "harness|gsm8k|5": { "acc": 0.5754359363153905, "acc_stderr": 0.013614835574956387 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. 
--> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. --> [More Information Needed] ## Dataset Structure <!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. --> [More Information Needed] ## Dataset Creation ### Curation Rationale <!-- Motivation for the creation of this dataset. --> [More Information Needed] ### Source Data <!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). --> #### Data Collection and Processing <!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. --> [More Information Needed] #### Who are the source data producers? <!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. --> [More Information Needed] ### Annotations [optional] <!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. --> #### Annotation process <!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. --> [More Information Needed] #### Who are the annotators? <!-- This section describes the people or systems who created the annotations. 
--> [More Information Needed] #### Personal and Sensitive Information <!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations. ## Citation [optional] <!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Dataset Card Authors [optional] [More Information Needed] ## Dataset Card Contact [More Information Needed]
AppleHarem/le_malin_azurlane
--- license: mit task_categories: - text-to-image tags: - art - not-for-all-audiences size_categories: - n<1K --- # Dataset of le_malin (Azur Lane) This is the dataset of le_malin (Azur Lane), containing 200 images and their tags. Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)). ([LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI)) | Name | Images | Download | Description | |:---|---:|:---|:---| | raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. | | raw-stage3 | 530 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. | | raw-stage3-eyes | 598 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. | | 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. | | 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. | | 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. | | stage3-640 | 530 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. | | stage3-800 | 530 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. | | stage3-p512-640 | 347 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. | | stage3-eyes-640 | 598 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. | | stage3-eyes-800 | 598 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
AkitoP/Hscene-Speech
--- license: apache-2.0 task_categories: - text-to-speech - automatic-speech-recognition language: - ja size_categories: - 100K<n<1M --- # This dataset contains NSFW content. Please use with caution. # Please Contact Us for Access to the Dataset ## Changelog ``` - 2024-04-09 Initial release ``` This dataset is only permitted for use for data analysis purposes (such as machine learning). Any use for purposes other than those specified is prohibited. ## Dataset Language Japanese ## Dataset Information This dataset is a medium-quality dataset of character acting speech audio by Japanese doujin voice actors. The initial release contains **835736 files**, **2235.72 hours** of audio. The dataset is in Japanese. Audio files are in their original format, and the dataset is provided as-is. Most of the audio files are 44.1 kHz or 48 kHz, while some are 22.05 kHz. The dataset is provided with zip compression. ## Dataset Structure ``` ├── transcript.csv ├── dataset │ ├── (files) ``` ## Transcript The transcript file contains the transcript of the audio files in the dataset. The format is: ``` filename,chara,duration,samplerate,transcript ``` ## Dataset Analysis Duration: ![](./duration_distribution.png)
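As a sketch of how that transcript format could be consumed, the rows below are made up for illustration; only the column order comes from the header format shown above:

```python
import csv
import io

# Hypothetical rows in the documented format:
# filename,chara,duration,samplerate,transcript
sample = (
    "0001.wav,alice,3.52,44100,こんにちは\n"
    "0002.wav,alice,1.10,48000,はい\n"
)

rows = [
    {
        "filename": filename,
        "chara": chara,
        "duration": float(duration),    # seconds
        "samplerate": int(samplerate),  # Hz
        "transcript": transcript,
    }
    for filename, chara, duration, samplerate, transcript in csv.reader(io.StringIO(sample))
]

# Aggregate duration across rows, in hours.
total_hours = sum(r["duration"] for r in rows) / 3600
```

For the real dataset, replace the in-memory sample with an open file handle on `transcript.csv`.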
felipesampaio/cantores
--- license: openrail ---
Unusualhackero2/AtifAslamHighQualityNewVoiceAiModel
--- license: apache-2.0 ---
MouhsineGT/dataset_new_fr_V2
--- license: unknown ---
Nerfgun3/ao_style
--- language: - en tags: - stable-diffusion - text-to-image license: creativeml-openrail-m inference: false --- # Ao Artist Embedding / Textual Inversion ## Usage To use this embedding you have to download the file as well as drop it into the "\stable-diffusion-webui\embeddings" folder. To use it in a prompt: ```"drawn by ao_style"``` If it is too strong just add [] around it. Trained until 10000 steps. I added a 7.5k-step trained version in the files as well. If you want to use that version, remove the ```"-7500"``` from the file name and replace the 10k-step version in your folder. Have fun :) ## Example Pictures <table> <tr> <td><img src=https://i.imgur.com/ec8MaO4.png width=100% height=100%/></td> <td><img src=https://i.imgur.com/N4IRulK.png width=100% height=100%/></td> <td><img src=https://i.imgur.com/22alJny.png width=100% height=100%/></td> <td><img src=https://i.imgur.com/ZPPIs9L.png width=100% height=100%/></td> <td><img src=https://i.imgur.com/XQZvjGs.png width=100% height=100%/></td> </tr> </table> ## License This embedding is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage. The CreativeML OpenRAIL License specifies: 1. You can't use the embedding to deliberately produce or share illegal or harmful outputs or content 2. The authors claim no rights on the outputs you generate; you are free to use them and are accountable for their use, which must not go against the provisions set in the license 3. You may re-distribute the weights and use the embedding commercially and/or as a service. If you do, please be aware you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M with all your users (please read the license entirely and carefully) [Please read the full license here](https://huggingface.co/spaces/CompVis/stable-diffusion-license)
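The file-renaming step described above (promoting the 7.5k-step file by stripping the `-7500` suffix) can be sketched in Python; the embedding filename here is a guess, since the card doesn't state it:

```python
from pathlib import Path

# Hypothetical filename of the 7.5k-step embedding; adjust to the actual download.
src = Path("ao_style-7500.pt")

# Strip the "-7500" suffix to get the name the prompt expects.
dst = src.with_name(src.name.replace("-7500", ""))

# To actually swap files inside "stable-diffusion-webui/embeddings", move the
# old 10k-step file out of the way first, then run: src.rename(dst)
```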
ShoukanLabs/OpenNiji-415001_450000
--- dataset_info: features: - name: image dtype: image - name: url dtype: string - name: prompt dtype: string - name: style dtype: string splits: - name: train num_bytes: 49549899819.051 num_examples: 34999 download_size: 54547926257 dataset_size: 49549899819.051 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "OpenNiji-415001_450000" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
EinfachOlder/testen
--- license: mit ---
skatemonke/bartek
--- license: unknown ---
jamestalentium/cnn_dailymail_100_rm
--- dataset_info: features: - name: input_text dtype: string - name: output_text dtype: string - name: id dtype: string splits: - name: train num_bytes: 439445.02164652944 num_examples: 100 download_size: 134076 dataset_size: 439445.02164652944 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "cnn_dailymail_100_rm" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
edarchimbaud/targets-monthly-sp500
--- dataset_info: features: - name: symbol dtype: string - name: date dtype: timestamp[ns] - name: return dtype: float64 - name: return_quintile dtype: int64 splits: - name: train num_bytes: 5885431 num_examples: 189255 download_size: 2997509 dataset_size: 5885431 --- # Dataset Card for "targets-monthly-sp500" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
liuyanchen1015/MULTI_VALUE_mrpc_possessives_for_post
--- dataset_info: features: - name: sentence1 dtype: string - name: sentence2 dtype: string - name: label dtype: int64 - name: idx dtype: int64 - name: value_score dtype: int64 splits: - name: test num_bytes: 214303 num_examples: 741 - name: train num_bytes: 434797 num_examples: 1493 - name: validation num_bytes: 54180 num_examples: 184 download_size: 460209 dataset_size: 703280 --- # Dataset Card for "MULTI_VALUE_mrpc_possessives_for_post" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Cohere/miracl-yo-corpus-22-12
--- annotations_creators: - expert-generated language: - yo multilinguality: - multilingual size_categories: [] source_datasets: [] tags: [] task_categories: - text-retrieval license: - apache-2.0 task_ids: - document-retrieval --- # MIRACL (yo) embedded with cohere.ai `multilingual-22-12` encoder We encoded the [MIRACL dataset](https://huggingface.co/miracl) using the [cohere.ai](https://txt.cohere.ai/multilingual/) `multilingual-22-12` embedding model. The query embeddings can be found in [Cohere/miracl-yo-queries-22-12](https://huggingface.co/datasets/Cohere/miracl-yo-queries-22-12) and the corpus embeddings can be found in [Cohere/miracl-yo-corpus-22-12](https://huggingface.co/datasets/Cohere/miracl-yo-corpus-22-12). For the original datasets, see [miracl/miracl](https://huggingface.co/datasets/miracl/miracl) and [miracl/miracl-corpus](https://huggingface.co/datasets/miracl/miracl-corpus). Dataset info: > MIRACL 🌍🙌🌏 (Multilingual Information Retrieval Across a Continuum of Languages) is a multilingual retrieval dataset that focuses on search across 18 different languages, which collectively encompass over three billion native speakers around the world. > > The corpus for each language is prepared from a Wikipedia dump, where we keep only the plain text and discard images, tables, etc. Each article is segmented into multiple passages using WikiExtractor based on natural discourse units (e.g., `\n\n` in the wiki markup). Each of these passages comprises a "document" or unit of retrieval. We preserve the Wikipedia article title of each passage. ## Embeddings We compute the embeddings for `title+" "+text` using our `multilingual-22-12` embedding model, a state-of-the-art model that works for semantic search in 100 languages. If you want to learn more about this model, have a look at [cohere.ai multilingual embedding model](https://txt.cohere.ai/multilingual/). 
## Loading the dataset In [miracl-yo-corpus-22-12](https://huggingface.co/datasets/Cohere/miracl-yo-corpus-22-12) we provide the corpus embeddings. Note, depending on the selected split, the respective files can be quite large. You can either load the dataset like this: ```python from datasets import load_dataset docs = load_dataset("Cohere/miracl-yo-corpus-22-12", split="train") ``` Or you can stream it without downloading it first: ```python from datasets import load_dataset docs = load_dataset("Cohere/miracl-yo-corpus-22-12", split="train", streaming=True) for doc in docs: docid = doc['docid'] title = doc['title'] text = doc['text'] emb = doc['emb'] ``` ## Search Have a look at [miracl-yo-queries-22-12](https://huggingface.co/datasets/Cohere/miracl-yo-queries-22-12) where we provide the query embeddings for the MIRACL dataset. To search in the documents, you must use **dot-product**. Then compare the query embeddings with the document embeddings, either via a vector database (recommended) or by computing the dot product directly. A full search example: ```python # Attention! For large datasets, this requires a lot of memory to store # all document embeddings and to compute the dot product scores. # Only use this for smaller datasets. 
# For large datasets, use a vector DB from datasets import load_dataset import torch #Load documents + embeddings docs = load_dataset("Cohere/miracl-yo-corpus-22-12", split="train") doc_embeddings = torch.tensor(docs['emb']) # Load queries queries = load_dataset("Cohere/miracl-yo-queries-22-12", split="dev") # Select the first query as example qid = 0 query = queries[qid] query_embedding = torch.tensor(query['emb']).unsqueeze(0) # Compute dot score between query embedding and document embeddings dot_scores = torch.mm(query_embedding, doc_embeddings.transpose(0, 1)) top_k = torch.topk(dot_scores, k=3) # Print results print("Query:", query['query']) for doc_id in top_k.indices[0].tolist(): print(docs[doc_id]['title']) print(docs[doc_id]['text']) ``` You can get embeddings for new queries using our API: ```python #Run: pip install cohere import cohere co = cohere.Client(f"{api_key}") # You should add your cohere API Key here :)) texts = ['my search query'] response = co.embed(texts=texts, model='multilingual-22-12') query_embedding = response.embeddings[0] # Get the embedding for the first text ``` ## Performance In the following table we compare the cohere multilingual-22-12 model with Elasticsearch version 8.6.0 lexical search (title and passage indexed as independent fields). Note that Elasticsearch doesn't support all languages that are part of the MIRACL dataset. We compute nDCG@10 (a ranking-based metric), as well as hit@3: is at least one relevant document among the top-3 results? We find that hit@3 is easier to interpret, as it presents the number of queries for which a relevant document is found among the top-3 results. Note: MIRACL only annotated a small fraction of passages (10 per query) for relevancy. Especially for larger Wikipedias (like English), we often found many more relevant passages. This is known as annotation holes. Real nDCG@10 and hit@3 performance is likely higher than depicted. 
| Model | cohere multilingual-22-12 nDCG@10 | cohere multilingual-22-12 hit@3 | ES 8.6.0 nDCG@10 | ES 8.6.0 hit@3 | |---|---|---|---|---| | miracl-ar | 64.2 | 75.2 | 46.8 | 56.2 | | miracl-bn | 61.5 | 75.7 | 49.2 | 60.1 | | miracl-de | 44.4 | 60.7 | 19.6 | 29.8 | | miracl-en | 44.6 | 62.2 | 30.2 | 43.2 | | miracl-es | 47.0 | 74.1 | 27.0 | 47.2 | | miracl-fi | 63.7 | 76.2 | 51.4 | 61.6 | | miracl-fr | 46.8 | 57.1 | 17.0 | 21.6 | | miracl-hi | 50.7 | 62.9 | 41.0 | 48.9 | | miracl-id | 44.8 | 63.8 | 39.2 | 54.7 | | miracl-ru | 49.2 | 66.9 | 25.4 | 36.7 | | **Avg** | 51.7 | 67.5 | 34.7 | 46.0 | Further languages (not supported by Elasticsearch): | Model | cohere multilingual-22-12 nDCG@10 | cohere multilingual-22-12 hit@3 | |---|---|---| | miracl-fa | 44.8 | 53.6 | | miracl-ja | 49.0 | 61.0 | | miracl-ko | 50.9 | 64.8 | | miracl-sw | 61.4 | 74.5 | | miracl-te | 67.8 | 72.3 | | miracl-th | 60.2 | 71.9 | | miracl-yo | 56.4 | 62.2 | | miracl-zh | 43.8 | 56.5 | | **Avg** | 54.3 | 64.6 |
fastjt/fasst
--- license: afl-3.0 ---
noisy-alpaca-test/MUSAN-music
--- configs: - config_name: default data_files: - split: test path: data/test-* dataset_info: features: - name: speech_input dtype: string - name: clean_audio dtype: audio - name: noisy_10dB dtype: audio - name: noisy_5dB dtype: audio - name: noisy_0dB dtype: audio - name: noisy_-5dB dtype: audio - name: noisy_-10dB dtype: audio - name: noisy_-20dB dtype: audio splits: - name: test num_bytes: 6795900010.1 num_examples: 5135 download_size: 6668227831 dataset_size: 6795900010.1 --- # Dataset Card for "MUSAN-music" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
52AI/TinyStoriesZh
--- license: mit --- While language models race toward ever larger sizes, researchers working in the opposite direction are probing the boundary capabilities of small LMs: for example, how small can a language model be and still tell a story fluently? [TinyStories](https://huggingface.co/datasets/roneneldan/TinyStories) is the short-story dataset used in that line of research. The stories were generated by the researchers with GPT-3.5 and GPT-4, and their difficulty is constrained to what a 3-4 year old child can understand. This Chinese dataset was produced by translating the English stories with a [translator](https://pypi.org/project/deep-translator/), as in the following example. > Lily and Ben are friends. They like to play in the park. One day, they see a big tree with a swing. Lily wants to try the swing. She runs to the tree and climbs on the swing.\n"Push me, Ben!" she says. Ben pushes her gently. Lily feels happy. She swings higher and higher. She laughs and shouts.\nBen watches Lily. He thinks she is cute. He wants to swing too. He waits for Lily to stop. But Lily does not stop. She swings faster and faster. She is having too much fun.\n"Can I swing too, Lily?" Ben asks. Lily does not hear him. She is too busy swinging. Ben feels sad. He walks away.\nLily swings so high that she loses her grip. She falls off the swing. She lands on the ground. She hurts her foot. She cries.\n"Ow, ow, ow!" she says. She looks for Ben. She wants him to help her. But Ben is not there. He is gone.\nLily feels sorry. She wishes she had shared the swing with Ben. She wishes he was there to hug her. She limps to the tree. She sees something hanging from a branch. It is Ben\'s hat. He left it for her.\nLily smiles. She thinks Ben is nice. She puts on his hat. She hopes he will come back. She wants to say sorry. She wants to be friends again. > 莉莉和本是朋友。他们喜欢在公园里玩。有一天,他们看到一棵有秋千的大树。莉莉想尝试秋千。她跑到树旁,爬上秋千。\n“推我吧,本!”她说。本轻轻地推了她一下。莉莉感觉很幸福。她荡得越来越高。她又笑又叫。\n本看着莉莉。他觉得她很可爱。他也想摇摆。他等着莉莉停下来。但莉莉并没有停下来。她摆动得越来越快。她玩得太开心了。\n“我也可以荡秋千吗,莉莉?”本问。莉莉没有听见他的话。她正忙着荡秋千。本感到难过。他走开了。\n莉莉荡得太高,以至于她失去了抓力。她从秋千上摔下来。她降落在地上。她的脚受伤了。她哭了。\n“呜呜呜!”她说。她寻找本。她想要他帮助她。但本不在那儿。他已经去了。\n莉莉感到抱歉。她希望自己能和本一起荡秋千。她希望他能在那里拥抱她。她一瘸一拐地走向树。她看到树枝上挂着什么东西。这是本的帽子。他留给她了。\n莉莉微笑着。她认为本很好。她戴上他的帽子。她希望他能回来。她想说对不起。她想再次成为朋友。
sm-esgstudier/redditStocks_last1000
--- license: mit tags: - finance --- # Dataset Card for Dataset Name ## Dataset Description - **Homepage:** - **Repository:** - **Paper:** - **Leaderboard:** - **Point of Contact:** ### Dataset Summary This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1). ### Supported Tasks and Leaderboards [More Information Needed] ### Languages English [More Information Needed] ## Dataset Structure ### Data Instances [More Information Needed] ### Data Fields [More Information Needed] ### Data Splits [More Information Needed] ## Dataset Creation ### Curation Rationale [More Information Needed] ### Source Data #### Initial Data Collection and Normalization [More Information Needed] #### Who are the source language producers? [More Information Needed] ### Annotations #### Annotation process [More Information Needed] #### Who are the annotators? [More Information Needed] ### Personal and Sensitive Information [More Information Needed] ## Considerations for Using the Data ### Social Impact of Dataset [More Information Needed] ### Discussion of Biases [More Information Needed] ### Other Known Limitations [More Information Needed] ## Additional Information ### Dataset Curators [More Information Needed] ### Licensing Information [More Information Needed] ### Citation Information [More Information Needed] ### Contributions [More Information Needed]
djmix/beats-mixes
--- dataset_info: features: - name: mix_id dtype: string - name: beats sequence: float64 splits: - name: train num_bytes: 425961256 num_examples: 5040 download_size: 244903841 dataset_size: 425961256 --- # Dataset Card for "beats-mixes" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
DAVI-AI/test-traduction
--- dataset_info: features: - name: phrase_fr dtype: string - name: phrase_lsf dtype: string splits: - name: train num_bytes: 87712 num_examples: 678 download_size: 29182 dataset_size: 87712 configs: - config_name: default data_files: - split: train path: data/train-* ---
dim/dolphin_flan1m_alpaca_uncensored_3k
--- dataset_info: features: - name: instruction dtype: string - name: input dtype: string - name: output dtype: string splits: - name: train num_bytes: 5235792.840107775 num_examples: 3000 download_size: 2954863 dataset_size: 5235792.840107775 --- # Dataset Card for "dolphin_flan1m_alpaca_uncensored_3k" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
huggingface-projects/DELETE-temp-match-results
--- license: mit ---
ikwak/test
--- license: apache-2.0 ---
ThePandaKing94/SaveHumanityGPT
--- license: mit ---
distilled-one-sec-cv12-each-chunk-uniq/chunk_65
--- dataset_info: features: - name: logits sequence: float32 - name: mfcc sequence: sequence: float64 splits: - name: train num_bytes: 1209987036.0 num_examples: 235773 download_size: 1235314857 dataset_size: 1209987036.0 --- # Dataset Card for "chunk_65" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
htetkhinekyaw/dailypyaz
--- license: openrail ---
irds/mmarco_v2_hi_dev
--- pretty_name: '`mmarco/v2/hi/dev`' viewer: false source_datasets: ['irds/mmarco_v2_hi'] task_categories: - text-retrieval --- # Dataset Card for `mmarco/v2/hi/dev` The `mmarco/v2/hi/dev` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package. For more information about the dataset, see the [documentation](https://ir-datasets.com/mmarco#mmarco/v2/hi/dev). # Data This dataset provides: - `queries` (i.e., topics); count=101,093 - `qrels`: (relevance assessments); count=59,273 - For `docs`, use [`irds/mmarco_v2_hi`](https://huggingface.co/datasets/irds/mmarco_v2_hi) ## Usage ```python from datasets import load_dataset queries = load_dataset('irds/mmarco_v2_hi_dev', 'queries') for record in queries: record # {'query_id': ..., 'text': ...} qrels = load_dataset('irds/mmarco_v2_hi_dev', 'qrels') for record in qrels: record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...} ``` Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the data in 🤗 Dataset format. ## Citation Information ``` @article{Bonifacio2021MMarco, title={{mMARCO}: A Multilingual Version of {MS MARCO} Passage Ranking Dataset}, author={Luiz Henrique Bonifacio and Israel Campiotti and Roberto Lotufo and Rodrigo Nogueira}, year={2021}, journal={arXiv:2108.13897} } ```
legwyn/temp_ds_aha
--- license: mit ---
BangumiBase/renaiflops
--- license: mit tags: - art size_categories: - 1K<n<10K --- # Bangumi Image Base of Ren`ai Flops This is the image base of the bangumi Ren`ai Flops; we detected 19 characters and 1980 images in total. The full dataset is [here](all.zip). **Please note that these image bases are not guaranteed to be 100% clean; they may still contain noise.** If you intend to manually train models using this dataset, we recommend performing necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability). Here is the characters' preview: | # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 | |:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------| | 0 | 714 | [Download](0/dataset.zip) | ![preview 1](0/preview_1.png) | ![preview 2](0/preview_2.png) | ![preview 3](0/preview_3.png) | ![preview 4](0/preview_4.png) | ![preview 5](0/preview_5.png) | ![preview 6](0/preview_6.png) | ![preview 7](0/preview_7.png) | ![preview 8](0/preview_8.png) | | 1 | 182 | [Download](1/dataset.zip) | ![preview 1](1/preview_1.png) | ![preview 2](1/preview_2.png) | ![preview 3](1/preview_3.png) | ![preview 4](1/preview_4.png) | ![preview 5](1/preview_5.png) | ![preview 6](1/preview_6.png) | ![preview 7](1/preview_7.png) | ![preview 8](1/preview_8.png) | | 2 | 10 | [Download](2/dataset.zip) | ![preview 1](2/preview_1.png) | ![preview 2](2/preview_2.png) | ![preview 3](2/preview_3.png) | ![preview 4](2/preview_4.png) | ![preview 5](2/preview_5.png) | ![preview 6](2/preview_6.png) | ![preview 7](2/preview_7.png) | ![preview 8](2/preview_8.png) | | 3 | 8 | [Download](3/dataset.zip) | ![preview 1](3/preview_1.png) | ![preview 2](3/preview_2.png) | 
![preview 3](3/preview_3.png) | ![preview 4](3/preview_4.png) | ![preview 5](3/preview_5.png) | ![preview 6](3/preview_6.png) | ![preview 7](3/preview_7.png) | ![preview 8](3/preview_8.png) | | 4 | 13 | [Download](4/dataset.zip) | ![preview 1](4/preview_1.png) | ![preview 2](4/preview_2.png) | ![preview 3](4/preview_3.png) | ![preview 4](4/preview_4.png) | ![preview 5](4/preview_5.png) | ![preview 6](4/preview_6.png) | ![preview 7](4/preview_7.png) | ![preview 8](4/preview_8.png) | | 5 | 19 | [Download](5/dataset.zip) | ![preview 1](5/preview_1.png) | ![preview 2](5/preview_2.png) | ![preview 3](5/preview_3.png) | ![preview 4](5/preview_4.png) | ![preview 5](5/preview_5.png) | ![preview 6](5/preview_6.png) | ![preview 7](5/preview_7.png) | ![preview 8](5/preview_8.png) | | 6 | 170 | [Download](6/dataset.zip) | ![preview 1](6/preview_1.png) | ![preview 2](6/preview_2.png) | ![preview 3](6/preview_3.png) | ![preview 4](6/preview_4.png) | ![preview 5](6/preview_5.png) | ![preview 6](6/preview_6.png) | ![preview 7](6/preview_7.png) | ![preview 8](6/preview_8.png) | | 7 | 95 | [Download](7/dataset.zip) | ![preview 1](7/preview_1.png) | ![preview 2](7/preview_2.png) | ![preview 3](7/preview_3.png) | ![preview 4](7/preview_4.png) | ![preview 5](7/preview_5.png) | ![preview 6](7/preview_6.png) | ![preview 7](7/preview_7.png) | ![preview 8](7/preview_8.png) | | 8 | 47 | [Download](8/dataset.zip) | ![preview 1](8/preview_1.png) | ![preview 2](8/preview_2.png) | ![preview 3](8/preview_3.png) | ![preview 4](8/preview_4.png) | ![preview 5](8/preview_5.png) | ![preview 6](8/preview_6.png) | ![preview 7](8/preview_7.png) | ![preview 8](8/preview_8.png) | | 9 | 101 | [Download](9/dataset.zip) | ![preview 1](9/preview_1.png) | ![preview 2](9/preview_2.png) | ![preview 3](9/preview_3.png) | ![preview 4](9/preview_4.png) | ![preview 5](9/preview_5.png) | ![preview 6](9/preview_6.png) | ![preview 7](9/preview_7.png) | ![preview 8](9/preview_8.png) | | 10 | 197 | 
[Download](10/dataset.zip) | ![preview 1](10/preview_1.png) | ![preview 2](10/preview_2.png) | ![preview 3](10/preview_3.png) | ![preview 4](10/preview_4.png) | ![preview 5](10/preview_5.png) | ![preview 6](10/preview_6.png) | ![preview 7](10/preview_7.png) | ![preview 8](10/preview_8.png) | | 11 | 42 | [Download](11/dataset.zip) | ![preview 1](11/preview_1.png) | ![preview 2](11/preview_2.png) | ![preview 3](11/preview_3.png) | ![preview 4](11/preview_4.png) | ![preview 5](11/preview_5.png) | ![preview 6](11/preview_6.png) | ![preview 7](11/preview_7.png) | ![preview 8](11/preview_8.png) | | 12 | 8 | [Download](12/dataset.zip) | ![preview 1](12/preview_1.png) | ![preview 2](12/preview_2.png) | ![preview 3](12/preview_3.png) | ![preview 4](12/preview_4.png) | ![preview 5](12/preview_5.png) | ![preview 6](12/preview_6.png) | ![preview 7](12/preview_7.png) | ![preview 8](12/preview_8.png) | | 13 | 74 | [Download](13/dataset.zip) | ![preview 1](13/preview_1.png) | ![preview 2](13/preview_2.png) | ![preview 3](13/preview_3.png) | ![preview 4](13/preview_4.png) | ![preview 5](13/preview_5.png) | ![preview 6](13/preview_6.png) | ![preview 7](13/preview_7.png) | ![preview 8](13/preview_8.png) | | 14 | 169 | [Download](14/dataset.zip) | ![preview 1](14/preview_1.png) | ![preview 2](14/preview_2.png) | ![preview 3](14/preview_3.png) | ![preview 4](14/preview_4.png) | ![preview 5](14/preview_5.png) | ![preview 6](14/preview_6.png) | ![preview 7](14/preview_7.png) | ![preview 8](14/preview_8.png) | | 15 | 6 | [Download](15/dataset.zip) | ![preview 1](15/preview_1.png) | ![preview 2](15/preview_2.png) | ![preview 3](15/preview_3.png) | ![preview 4](15/preview_4.png) | ![preview 5](15/preview_5.png) | ![preview 6](15/preview_6.png) | N/A | N/A | | 16 | 7 | [Download](16/dataset.zip) | ![preview 1](16/preview_1.png) | ![preview 2](16/preview_2.png) | ![preview 3](16/preview_3.png) | ![preview 4](16/preview_4.png) | ![preview 5](16/preview_5.png) | ![preview 6](16/preview_6.png) 
| ![preview 7](16/preview_7.png) | N/A | | 17 | 6 | [Download](17/dataset.zip) | ![preview 1](17/preview_1.png) | ![preview 2](17/preview_2.png) | ![preview 3](17/preview_3.png) | ![preview 4](17/preview_4.png) | ![preview 5](17/preview_5.png) | ![preview 6](17/preview_6.png) | N/A | N/A | | noise | 112 | [Download](-1/dataset.zip) | ![preview 1](-1/preview_1.png) | ![preview 2](-1/preview_2.png) | ![preview 3](-1/preview_3.png) | ![preview 4](-1/preview_4.png) | ![preview 5](-1/preview_5.png) | ![preview 6](-1/preview_6.png) | ![preview 7](-1/preview_7.png) | ![preview 8](-1/preview_8.png) |
LEAP/ClimSim_high-res
--- license: cc-by-4.0 --- The corresponding GitHub repo can be found here: https://github.com/leap-stc/ClimSim Read more: https://arxiv.org/abs/2306.08754.
tanvirsrbd1/srbd-test1-1_annotated_segmented
--- dataset_info: features: - name: html dtype: string - name: response dtype: string splits: - name: train num_bytes: 1837883 num_examples: 2980 download_size: 607662 dataset_size: 1837883 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "srbd-test1-1_annotated_segmented" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
eneskadumi/turkishReviews-ds-mini
--- dataset_info: features: - name: review dtype: string - name: review_length dtype: int64 splits: - name: train num_bytes: 1252876.2642514652 num_examples: 3378 - name: validation num_bytes: 139455.7357485349 num_examples: 376 download_size: 896651 dataset_size: 1392332.0 configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* ---
aureliojafer/twitter_dataset_1709851649
--- dataset_info: features: - name: id dtype: string - name: tweet_content dtype: string - name: user_name dtype: string - name: user_id dtype: string - name: created_at dtype: string - name: url dtype: string - name: favourite_count dtype: int64 splits: - name: train num_bytes: 60868 num_examples: 201 download_size: 40086 dataset_size: 60868 configs: - config_name: default data_files: - split: train path: data/train-* ---
mrovera/eventnet-ita
--- language: it license: cc-by-sa-4.0 multilinguality: monolingual task_categories: - token-classification tags: - Frame Parsing - Event Extraction --- # EventNet-ITA - Dataset ## Dataset Description ### Dataset Summary EventNet-ITA is a textual dataset annotated _full-text_ with semantic frames in Italian. It can be used to train multi-label models for Frame Parsing or Event Extraction. The schema consists of 205 semantic frames (and their frame elements) and covers various macro-domains, such as conflict, social, communication, legal, geopolitical, economic and biographical events, among others. The dataset contains 53,843 annotated sentences and over 1,583,000 tokens. For more details, please refer to the [paper](https://aclanthology.org/2024.latechclfl-1.9/). If you want to request the full documentation of the resource (guidelines, detailed frame-level description, lexical units and annotation examples), please fill out [this form](https://forms.gle/qAgZsf4La9qdzETn6) or email [the author](mailto:eventnetita@gmail.com). ### Annotation Process EventNet-ITA has been annotated at token level, adopting the IOB2 style. The annotation is full-text, i.e., for each sentence every frame mention and all of its frame elements (provided in the schema) are annotated. Example: ``` La O B-EVENT O costruzione B-BUILDING I-EVENT O della B-CREATED_ENTITY I-EVENT O fortificazione I-CREATED_ENTITY I-EVENT O alvitana I-CREATED_ENTITY I-EVENT O risale O B-TEMPORAL_ORIGIN O dunque O O O all' O B-ORIGIN O epoca O I-ORIGIN O dell' O I-ORIGIN O invasione O I-ORIGIN B-INVADING normanna O I-ORIGIN B-INVADER . O O O ``` By convention, in the dataset frame elements are represented as a concatenation of their label name with the name of the corresponding frame. For example, the `CREATED_ENTITY` frame element, associated with the `BUILDING` frame, will be represented as `CREATED_ENTITY*BUILDING`. ### Data format The dataset is formatted as a two-column tsv. 
The first column contains the token, the second column contains all corresponding labels (both frames and frame elements), separated by `|`. This format makes the dataset ready-to-train with the MaChAmp [multi-sequence](https://github.com/machamp-nlp/machamp/blob/master/docs/multiseq.md) task type. Please see the [model page](https://huggingface.co/mrovera/eventnet-ita) for more details about training. ### Data Split For the sake of reproducibility, the three folds used in the paper are provided. The data split follows an 80/10/10 ratio and has been created in a stratified way. This means each train/dev/test set contains the same relative distribution of (frame) classes. ## Additional Information ### Licensing Information The EventNet-ITA dataset is released under the CC-BY-SA-4.0 License. ### Citation Information If you use EventNet-ITA, please cite the following paper: ``` @inproceedings{rovera-2024-eventnet, title = "{E}vent{N}et-{ITA}: {I}talian Frame Parsing for Events", author = "Rovera, Marco", editor = "Bizzoni, Yuri and Degaetano-Ortlieb, Stefania and Kazantseva, Anna and Szpakowicz, Stan", booktitle = "Proceedings of the 8th Joint SIGHUM Workshop on Computational Linguistics for Cultural Heritage, Social Sciences, Humanities and Literature (LaTeCH-CLfL 2024)", year = "2024", publisher = "Association for Computational Linguistics", pages = "77--90", } ```
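The two-column tsv format described above can be read with a short helper like the one below. This is only a sketch under stated assumptions (tab-separated columns, `|`-separated label strings, and blank lines between sentences, as described in the card); users of MaChAmp can rely on its multiseq reader instead.

```python
def read_eventnet_tsv(lines):
    """Parse EventNet-ITA-style two-column tsv lines into sentences.

    Each line holds a token, a tab, and the token's `|`-separated IOB2
    labels (frames and frame elements); blank lines separate sentences.
    """
    sentences, tokens, labels = [], [], []
    for line in lines:
        line = line.rstrip("\n")
        if not line:  # blank line: sentence boundary (an assumption)
            if tokens:
                sentences.append((tokens, labels))
                tokens, labels = [], []
            continue
        token, tag_field = line.split("\t")
        tokens.append(token)
        labels.append(tag_field.split("|"))
    if tokens:  # flush a final sentence with no trailing blank line
        sentences.append((tokens, labels))
    return sentences


# Toy input shaped like the card's annotation example
sample = [
    "La\tO|B-EVENT|O",
    "costruzione\tB-BUILDING|I-EVENT|O",
    "della\tB-CREATED_ENTITY*BUILDING|I-EVENT|O",
    "",
]
for toks, tags in read_eventnet_tsv(sample):
    print(list(zip(toks, tags)))
```

Each sentence comes back as parallel token and label lists, so the `|`-separated columns can be fed to a multi-label sequence tagger one label sequence at a time.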
AdapterOcean/code_instructions_standardized_cluster_3
--- dataset_info: features: - name: text dtype: string - name: conversation_id dtype: int64 - name: embedding sequence: float64 - name: cluster dtype: int64 splits: - name: train num_bytes: 121262709 num_examples: 10569 download_size: 39576950 dataset_size: 121262709 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "code_instructions_standardized_cluster_3" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Superdetec/Johnlinch
--- license: openrail ---
LLM360/CrystalCoderDatasets
--- language: - en tags: - pretrained license: odc-by --- # Description of the Dataset This release integrates the entire data sequence utilized in the CrystalCoder training. It encompasses data sequences from the three pre-training stages, combining information from two prior works: the [SlimPajama dataset](https://huggingface.co/datasets/cerebras/SlimPajama-627B) and [StarCoder](https://huggingface.co/datasets/bigcode/starcoderdata), totaling approximately 1300 billion tokens. These tokens are distributed across three stages, each with distinct weights. ## Stage 1 During this initial stage, half of the [SlimPajama data](https://huggingface.co/datasets/cerebras/SlimPajama-627B) is utilized, equivalent to approximately 345 billion tokens. ## Stage 2 In the second stage, the remaining half of the [SlimPajama data](https://huggingface.co/datasets/cerebras/SlimPajama-627B) is employed, along with two epochs of [StarCoder data](https://huggingface.co/datasets/bigcode/starcoderdata). For the StarCoder data, we apply [FIM augmentation](https://arxiv.org/abs/2207.14255) with an FIM rate of 0.9 and an SPM rate of 0.5. The total token count for this stage is calculated as 0.5 * 690 + 2 * 291, resulting in 927 billion tokens. ## Stage 3 The third stage involves reusing Python and web-related data from the [StarCoder data](https://huggingface.co/datasets/bigcode/starcoderdata), including HTML, CSS, and JavaScript. This data is utilized for training over three epochs, with the application of FIM at a rate of 0.3 alongside an SPM rate of 0.5. The total token count for this stage is 100 billion. Additionally, a small portion of the SlimPajama dataset, excluding the Github part, is also reused, contributing around 10 billion tokens. ### Instruction tuning (Stage 3a) To enhance the model's proficiency in real chat scenarios, we utilize a diverse set of instruction tuning datasets, totaling approximately 1 billion tokens. 
Specifically, our data include [OASST1-guanaco](https://huggingface.co/datasets/openaccess-ai-collective/oasst1-guanaco-extended-sharegpt), [SlimOrca](https://huggingface.co/datasets/Open-Orca/SlimOrca), [ShareGPT_V4.3](https://huggingface.co/datasets/Aeala/ShareGPT_Vicuna_unfiltered), [Evol-ShareGPT](https://huggingface.co/datasets/WizardLM/WizardLM_evol_instruct_V2_196k), [CodeAlpaca](https://huggingface.co/datasets/lucasmccabe-lmi/CodeAlpaca-20k), [Rosetta Code](https://github.com/sahil280114/codealpaca/blob/master/data/rosetta_alpaca.json), [Evol-CodeAlpaca 1](https://huggingface.co/datasets/theblackcat102/evol-codealpaca-v1), [Evol-CodeAlpaca 2](https://huggingface.co/datasets/nickrosh/Evol-Instruct-Code-80k-v1), and a self-generated dataset centered on website creation through the [Alpaca](https://github.com/tatsu-lab/stanford_alpaca) pipeline. We will release the full dataset soon. The detailed breakdown of the tokens is as follows: ![data split](./data_split.png) # Primary Usage This dataset serves as the foundation for training CrystalCoder and supports further reproduction. For training from scratch, please refer to our [training code](https://github.com/LLM360/crystalcoder-train). For training from intermediate checkpoints, please load the dataloader states in the checkpoints and follow [this tutorial](https://docs.cerebras.net/en/latest/wsc/tutorials/dataloader-checkpointing.html). # License Pretraining data for language models mostly comes from a collection of data sources with various licenses. Any use of all or part of the data here must abide by the terms of the original licenses, including attribution clauses when relevant. We refer users to the [SlimPajama dataset](https://huggingface.co/datasets/cerebras/SlimPajama-627B) and [StarCoder](https://huggingface.co/datasets/bigcode/starcoderdata) for detailed license attribution. 
We release our work under [ODC-BY](https://opendatacommons.org/licenses/by/1-0/), hence granting the rights over the dataset, but not the contents of the dataset individually.
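As an illustration of the FIM augmentation referenced above, here is a toy string-level sketch: a document is split at two random points and rearranged into prefix/suffix/middle order. The sentinel strings `<PRE>`, `<SUF>`, and `<MID>` are placeholders, not CrystalCoder's actual special tokens; real training applies the transform to token sequences rather than raw strings, and the exact split policy here is an assumption. The default rates match the 0.9 FIM rate and 0.5 SPM rate stated for Stage 2.

```python
import random


def fim_transform(doc, fim_rate=0.9, spm_rate=0.5, rng=random):
    """With probability `fim_rate`, rearrange `doc` for fill-in-the-middle.

    PSM order: <PRE> prefix <SUF> suffix <MID> middle
    SPM order (chosen with probability `spm_rate`): suffix before prefix.
    The model then learns to generate the middle given prefix and suffix.
    """
    if rng.random() >= fim_rate:
        return doc  # leave this document in plain left-to-right order
    # Pick two distinct cut points, splitting doc into prefix/middle/suffix
    i, j = sorted(rng.sample(range(len(doc) + 1), 2))
    prefix, middle, suffix = doc[:i], doc[i:j], doc[j:]
    if rng.random() < spm_rate:
        return f"<SUF>{suffix}<PRE>{prefix}<MID>{middle}"
    return f"<PRE>{prefix}<SUF>{suffix}<MID>{middle}"


print(fim_transform("def add(a, b): return a + b", rng=random.Random(0)))
```

A cheap sanity check for any FIM implementation is that concatenating prefix, middle, and suffix in their original order recovers the source document exactly.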
nourlachtar/M2BD
--- dataset_info: features: - name: id dtype: int64 - name: tokens sequence: string - name: ner_tags sequence: int64 splits: - name: train num_bytes: 1643326 num_examples: 758 - name: validation num_bytes: 82025 num_examples: 42 - name: test num_bytes: 83279 num_examples: 43 download_size: 314857 dataset_size: 1808630 --- # Dataset Card for "M2BD" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
ZhangShenao/0.001_idpo_declr_ref_response
--- dataset_info: features: - name: prompt dtype: string - name: prompt_id dtype: string - name: chosen list: - name: content dtype: string - name: role dtype: string - name: rejected list: - name: content dtype: string - name: role dtype: string - name: messages list: - name: content dtype: string - name: role dtype: string - name: score_chosen dtype: float64 - name: score_rejected dtype: float64 - name: reference_response dtype: string splits: - name: train_prefs_1 num_bytes: 164111773 num_examples: 20378 - name: test_prefs_1 num_bytes: 16019213 num_examples: 2000 - name: train_prefs_2 num_bytes: 168308402 num_examples: 20378 - name: test_prefs_2 num_bytes: 16347825 num_examples: 2000 download_size: 201829010 dataset_size: 364787213 configs: - config_name: default data_files: - split: train_prefs_1 path: data/train_prefs_1-* - split: test_prefs_1 path: data/test_prefs_1-* - split: train_prefs_2 path: data/train_prefs_2-* - split: test_prefs_2 path: data/test_prefs_2-* --- # Dataset Card for "0.001_idpo_declr_ref_response" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
W4nkel/dataSet2
--- license: cc-by-sa-4.0 ---
ZhangShenao/0.0001_idpo_noreplacerej_decalpha_ref_response
--- dataset_info: features: - name: prompt dtype: string - name: prompt_id dtype: string - name: chosen list: - name: content dtype: string - name: role dtype: string - name: rejected list: - name: content dtype: string - name: role dtype: string - name: messages list: - name: content dtype: string - name: role dtype: string - name: score_chosen dtype: float64 - name: score_rejected dtype: float64 - name: reference_response dtype: string splits: - name: train_prefs_2 num_bytes: 166494525 num_examples: 20378 - name: test_prefs_2 num_bytes: 16192173 num_examples: 2000 download_size: 101470758 dataset_size: 182686698 configs: - config_name: default data_files: - split: train_prefs_2 path: data/train_prefs_2-* - split: test_prefs_2 path: data/test_prefs_2-* --- # Dataset Card for "0.0001_idpo_noreplacerej_decalpha_ref_response" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
bias-amplified-splits/mnli
--- license: cc-by-4.0 dataset_info: - config_name: minority_examples features: - name: premise dtype: string - name: hypothesis dtype: string - name: label dtype: class_label: names: '0': entailment '1': neutral '2': contradiction - name: idx dtype: int32 splits: - name: train.biased num_bytes: 58497575 num_examples: 309873 - name: train.anti_biased num_bytes: 16122071 num_examples: 82829 - name: validation_matched.biased num_bytes: 1443678 num_examples: 7771 - name: validation_matched.anti_biased num_bytes: 390105 num_examples: 2044 - name: validation_mismatched.biased num_bytes: 1536381 num_examples: 7797 - name: validation_mismatched.anti_biased num_bytes: 412850 num_examples: 2035 download_size: 92308759 dataset_size: 78402660 - config_name: partial_input features: - name: premise dtype: string - name: hypothesis dtype: string - name: label dtype: class_label: names: '0': entailment '1': neutral '2': contradiction - name: idx dtype: int32 splits: - name: train.biased num_bytes: 59529986 num_examples: 309873 - name: train.anti_biased num_bytes: 15089660 num_examples: 82829 - name: validation_matched.biased num_bytes: 1445996 num_examples: 7745 - name: validation_matched.anti_biased num_bytes: 387787 num_examples: 2070 - name: validation_mismatched.biased num_bytes: 1529878 num_examples: 7758 - name: validation_mismatched.anti_biased num_bytes: 419353 num_examples: 2074 download_size: 92308759 dataset_size: 78402660 task_categories: - text-classification language: - en pretty_name: MultiNLI size_categories: - 100K<n<1M --- # Dataset Card for Bias-amplified Splits for MultiNLI ## Table of Contents - [Table of Contents](#table-of-contents) - [Dataset Description](#dataset-description) - [Dataset Summary](#dataset-summary) - [Dataset Structure](#dataset-structure) - [Data Instances](#data-instances) - [Data Fields](#data-fields) - [Data Splits](#data-splits) - [Dataset Creation](#dataset-creation) - [Curation Rationale](#curation-rationale) - 
[Annotations](#annotations) - [Considerations for Using the Data](#considerations-for-using-the-data) - [Social Impact of Dataset](#social-impact-of-dataset) - [Discussion of Biases](#discussion-of-biases) - [Additional Information](#additional-information) - [Dataset Curators](#dataset-curators) - [Citation Information](#citation-information) ## Dataset Description - **Repository:** [Fighting Bias with Bias repo](https://github.com/schwartz-lab-nlp/fight-bias-with-bias) - **Paper:** [arXiv](https://arxiv.org/abs/2305.18917) - **Point of Contact:** [Yuval Reif](mailto:yuval.reif@mail.huji.ac.il) - **Original Dataset's Paper:** [MultiNLI](https://arxiv.org/abs/1704.05426) ### Dataset Summary Bias-amplified splits is a novel evaluation framework to assess model robustness, by amplifying dataset biases in the training data and challenging models to generalize beyond them. This framework is defined by a bias-amplified training set and a hard, anti-biased test set, which we automatically extract from existing datasets using model-based methods. Our experiments show that the identified anti-biased examples are naturally challenging for models, and moreover, models trained on bias-amplified data exhibit dramatic performance drops on anti-biased examples, which are not mitigated by common approaches to improve generalization. Here we apply our framework to **MultiNLI**, a crowd-sourced collection of 433k sentence pairs annotated with textual entailment information. Our evaluation framework can be applied to any existing dataset, even those considered obsolete, to test model robustness. We hope our work will guide the development of robust models that do not rely on superficial biases and correlations. 
#### Evaluation Results (DeBERTa-large) ##### For splits based on minority examples: | Training Data \ Test Data | Original test | Anti-biased test | |---------------------------|---------------|------------------| | Original training split | 91.1 | 74.3 | | Biased training split | 88.7 | 57.5 | ##### For splits based on partial-input model: | Training Data \ Test Data | Original test | Anti-biased test | |---------------------------|---------------|------------------| | Original training split | 91.1 | 81.4 | | Biased training split | 89.5 | 71.8 | #### Loading the Data ``` from datasets import load_dataset # choose which bias detection method to use for the bias-amplified splits: either "minority_examples" or "partial_input" dataset = load_dataset("bias-amplified-splits/mnli", "minority_examples") # use the biased training split and anti-biased test split train_dataset = dataset['train.biased'] eval_dataset = dataset['validation_matched.anti_biased'] ``` ## Dataset Structure ### Data Instances Data instances are taken directly from MultiNLI (GLUE version), and re-split into biased and anti-biased subsets. Here is an example of an instance from the dataset: ``` { "idx": 0, "premise": "Your contribution helped make it possible for us to provide our students with a quality education.", "hypothesis": "Your contributions were of no help with our students' education.", "label": 2 } ``` ### Data Fields - `idx`: unique identifier for the example within its original data splits (e.g., validation matched) - `premise`: a piece of text - `hypothesis`: a piece of text that may be true, false, or whose truth conditions may not be knowable when compared to the premise - `label`: one of `0`, `1` and `2` (`entailment`, `neutral`, and `contradiction`) ### Data Splits Bias-amplified splits require a method to detect *biased* and *anti-biased* examples in datasets. 
We release bias-amplified splits created with each of these two methods:

- **Minority examples**: A novel method we introduce that leverages representation learning and clustering for identifying anti-biased *minority examples* (Tu et al., 2020)—examples that defy common statistical patterns found in the rest of the dataset.
- **Partial-input baselines**: A common method for identifying biased examples containing annotation artifacts in a dataset, which examines the performance of models that are restricted to using only part of the input. Such models, if successful, are bound to rely on unintended or spurious patterns in the dataset.

Using each of the two methods, we split each of the original train and test splits into biased and anti-biased subsets. See the [paper](https://arxiv.org/abs/2305.18917) for more details.

#### Minority Examples

| Dataset Split | Number of Instances in Split |
|-------------------------------------|------------------------------|
| Train - biased | 309873 |
| Train - anti-biased | 82829 |
| Validation matched - biased | 7771 |
| Validation matched - anti-biased | 2044 |
| Validation mismatched - biased | 7797 |
| Validation mismatched - anti-biased | 2035 |

#### Partial-input Baselines

| Dataset Split | Number of Instances in Split |
|-------------------------------------|------------------------------|
| Train - biased | 309873 |
| Train - anti-biased | 82829 |
| Validation matched - biased | 7745 |
| Validation matched - anti-biased | 2070 |
| Validation mismatched - biased | 7758 |
| Validation mismatched - anti-biased | 2074 |

## Dataset Creation

### Curation Rationale

NLP models often rely on superficial cues known as *dataset biases* to achieve impressive performance, and can fail on examples where these biases do not hold. To develop more robust, unbiased models, recent work aims to filter biased examples from training sets.
We argue that in order to encourage the development of robust models, we should in fact **amplify** biases in the training sets, while adopting the challenge set approach and making test sets anti-biased. To implement our approach, we introduce a simple framework that can be applied automatically to any existing dataset to use it for testing model robustness.

### Annotations

#### Annotation process

No new annotations are required to create bias-amplified splits. Existing data instances are split into *biased* and *anti-biased* splits based on automatic model-based methods to detect such examples.

## Considerations for Using the Data

### Social Impact of Dataset

Bias-amplified splits were created to promote the development of robust NLP models that do not rely on superficial biases and correlations, and provide more challenging evaluation of existing systems.

### Discussion of Biases

We propose to use bias-amplified splits to complement benchmarks with challenging evaluation settings that test model robustness, in addition to the dataset’s main training and test sets. As such, while existing dataset biases are *amplified* during training with bias-amplified splits, these splits are intended primarily for model evaluation, to expose the bias-exploiting behaviors of models and to identify more robust models and effective robustness interventions.

## Additional Information

### Dataset Curators

Bias-amplified splits were introduced by Yuval Reif and Roy Schwartz from the [Hebrew University of Jerusalem](https://schwartz-lab-huji.github.io). MultiNLI was developed by Adina Williams, Nikita Nangia and Samuel Bowman.
### Citation Information ``` @misc{reif2023fighting, title = "Fighting Bias with Bias: Promoting Model Robustness by Amplifying Dataset Biases", author = "Yuval Reif and Roy Schwartz", month = may, year = "2023", url = "https://arxiv.org/pdf/2305.18917", } ``` Source dataset: ``` @InProceedings{N18-1101, author = "Williams, Adina and Nangia, Nikita and Bowman, Samuel", title = "A Broad-Coverage Challenge Corpus for Sentence Understanding through Inference", booktitle = "Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers)", year = "2018", publisher = "Association for Computational Linguistics", pages = "1112--1122", location = "New Orleans, Louisiana", url = "http://aclweb.org/anthology/N18-1101" } ```
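The biased/anti-biased counts in the split tables of this card can be sanity-checked: for each bias-detection method, the two subsets partition the corresponding original MultiNLI split, so their sizes should sum to the original split sizes. A minimal sketch of that check, with the split counts copied from the card and the original sizes (392,702 / 9,815 / 9,832) assumed from the published MultiNLI (GLUE) release:

```python
# For each bias-detection method, the biased and anti-biased subsets should
# partition the corresponding original MultiNLI split exactly.
ORIGINAL_SIZES = {
    "train": 392702,
    "validation_matched": 9815,
    "validation_mismatched": 9832,
}

# (biased, anti-biased) counts from the tables in this card.
SPLIT_COUNTS = {
    "minority_examples": {
        "train": (309873, 82829),
        "validation_matched": (7771, 2044),
        "validation_mismatched": (7797, 2035),
    },
    "partial_input": {
        "train": (309873, 82829),
        "validation_matched": (7745, 2070),
        "validation_mismatched": (7758, 2074),
    },
}

for method, splits in SPLIT_COUNTS.items():
    for split, (biased, anti_biased) in splits.items():
        total = biased + anti_biased
        assert total == ORIGINAL_SIZES[split], (method, split, total)
        print(f"{method}/{split}: {biased} + {anti_biased} = {total}")
```

Note that both methods happen to place the same number of training examples in each subset (309,873 biased / 82,829 anti-biased), but they select different validation examples.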
HAERAE-HUB/HAE-RAE-COT-1.5M
--- language: - ko license: cc-by-4.0 size_categories: - 1M<n<10M dataset_info: features: - name: Task dtype: string - name: Question dtype: string - name: CoT_Rationale dtype: string - name: __index_level_0__ dtype: int64 splits: - name: train num_bytes: 1789334780 num_examples: 1586688 download_size: 1053794527 dataset_size: 1789334780 tags: - haerae configs: - config_name: default data_files: - split: train path: data/train-* ---

# Dataset Card for "HAE-RAE-COT-1.5M"

HAE-RAE-COT-1.5M is a dataset encompassing 1,586,688 samples of questions paired with CoT (Chain of Thought) rationales. The majority of this dataset is a translation of samples from the [CoT-Collection](https://huggingface.co/datasets/kaist-ai/CoT-Collection), with a smaller portion derived from Korean datasets using the gpt-3.5-turbo API. The translation of the CoT-Collection was carried out using the NLLB 600M model. To the best of our knowledge, HAE-RAE-COT-1.5M represents the largest available CoT or instruction dataset in the Korean language.

# Summary of included datasets.

| Dataset | Count | % | License |
|---------|-------|---|---------|
| CoT Collection | 1,545,112 | 97.38% | non-commercial use |
| [LEAP_NLI](https://github.com/onspark/LEAP_NLI_v2.0) | 11,547 | 0.73% | unknown |
| [squad_kor_v1](https://huggingface.co/datasets/squad_kor_v1) | 9,606 | 0.61% | cc-by-nd-4.0 |
| [mwp_kor_v2](https://github.com/jkc-ai/mwp_kr_data) | 9,327 | 0.59% | Apache License 2.0 |
| [KorWikiTQ](https://github.com/LG-NLP/KorWikiTableQuestions) | 8,276 | 0.52% | CC BY-SA 4.0 |
| [mwp_kor_v3](https://github.com/tunib-ai/KMWP) | 2,820 | 0.18% | CC-BY-NC-SA 4.0 |
| **Total** | 1,586,688 | 100.00% | |

# License

HAE-RAE-COT-1.5M is only for non-commercial use and is subject to OpenAI's Terms of Use for the generated data.
# Point of Contact For any questions contact us via the following email:) ``` spthsrbwls123@yonsei.ac.kr ``` # Contributors [Baek Sang Won](http://www.linkedin.com/in/sangwon-baek-74a3241b7), [Guijin Son](https://github.com/guijinSON)
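As a quick sketch, each source dataset's percentage share in the summary table of this card can be recomputed directly from its raw sample count (count / total × 100); all counts below are taken from that table:

```python
# Recompute each source dataset's share of HAE-RAE-COT-1.5M from its raw count.
counts = {
    "CoT Collection": 1_545_112,
    "LEAP_NLI": 11_547,
    "squad_kor_v1": 9_606,
    "mwp_kor_v2": 9_327,
    "KorWikiTQ": 8_276,
    "mwp_kor_v3": 2_820,
}

total = sum(counts.values())
assert total == 1_586_688  # matches the advertised number of samples

for name, n in counts.items():
    print(f"{name}: {n:,} samples ({100 * n / total:.2f}%)")
```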
mask-distilled-one-sec-cv12/chunk_134
--- dataset_info: features: - name: logits sequence: float32 - name: mfcc sequence: sequence: float64 splits: - name: train num_bytes: 1211009992 num_examples: 237826 download_size: 1235978510 dataset_size: 1211009992 --- # Dataset Card for "chunk_134" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
patruff/chucklesMistralB
--- dataset_info: features: - name: original dtype: string - name: chucklebot dtype: string splits: - name: train num_bytes: 671498 num_examples: 2532 - name: test num_bytes: 169024 num_examples: 634 download_size: 387571 dataset_size: 840522 configs: - config_name: default data_files: - split: train path: data/train-* - split: test path: data/test-* ---
yardeny/tokenized_gpt2_context_len_32
--- dataset_info: features: - name: input_ids sequence: int32 - name: attention_mask sequence: int8 splits: - name: train num_bytes: 6891465229 num_examples: 80462898 download_size: 3033421664 dataset_size: 6891465229 --- # Dataset Card for "tokenized_gpt2_context_len_32" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
CloudTron/ConvToolFormer
--- license: cc-by-4.0 --- This is a dataset of long conversations with Toolformer-style API calls, generated by GPT-3.5. It currently contains 61,900 conversations, each with 10–15 turns and 2–3 API calls.
JefferyZhan/Language-prompted-Localization-Dataset
--- license: cc-by-nc-4.0 ---
nath720/colabstable
--- license: openrail ---
vrish/nasa-ads-1
--- license: apache-2.0 ---
AdapterOcean/physics_dataset_standardized_cluster_2_alpaca
--- dataset_info: features: - name: input dtype: string - name: output dtype: string splits: - name: train num_bytes: 16864397 num_examples: 5571 download_size: 0 dataset_size: 16864397 configs: - config_name: default data_files: - split: train path: data/train-* --- # Dataset Card for "physics_dataset_standardized_cluster_2_alpaca" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
dhiruHF/research_paper_multi_label_data_1k
--- dataset_info: features: - name: text dtype: string splits: - name: train num_bytes: 1215293 num_examples: 1000 download_size: 662344 dataset_size: 1215293 --- # Dataset Card for "research_paper_multi_label_data_1k" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
na2s/nass2ss
--- license: other ---
quccili/invoice
--- configs: - config_name: default data_files: - split: train path: data/train-* - split: validation path: data/validation-* - split: test path: data/test-* dataset_info: features: - name: image dtype: image - name: ground_truth dtype: string splits: - name: train num_bytes: 14986534.0 num_examples: 18 - name: validation num_bytes: 14986534.0 num_examples: 18 - name: test num_bytes: 14986534.0 num_examples: 18 download_size: 39577947 dataset_size: 44959602.0 --- # Dataset Card for "invoice" [More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
open-llm-leaderboard/details_janhq__supermario-v2
--- pretty_name: Evaluation run of janhq/supermario-v2 dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [janhq/supermario-v2](https://huggingface.co/janhq/supermario-v2) on the [Open\ \ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configurations, each one corresponding to one of the\ \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" stores all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_janhq__supermario-v2\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2023-12-12T05:33:32.497051](https://huggingface.co/datasets/open-llm-leaderboard/details_janhq__supermario-v2/blob/main/results_2023-12-12T05-33-32.497051.json) (note\ \ that there might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks.
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6533206964078709,\n\ \ \"acc_stderr\": 0.03205268515858169,\n \"acc_norm\": 0.6531030767387064,\n\ \ \"acc_norm_stderr\": 0.03271825548664744,\n \"mc1\": 0.44430844553243576,\n\ \ \"mc1_stderr\": 0.017394586250743173,\n \"mc2\": 0.605797177274584,\n\ \ \"mc2_stderr\": 0.015128279082831566\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.6578498293515358,\n \"acc_stderr\": 0.013864152159177275,\n\ \ \"acc_norm\": 0.6851535836177475,\n \"acc_norm_stderr\": 0.013572657703084948\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6763592909778928,\n\ \ \"acc_stderr\": 0.004669085411342194,\n \"acc_norm\": 0.8650667197769368,\n\ \ \"acc_norm_stderr\": 0.0034095405332498414\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \ \ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\ \ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\ \ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\ \ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\ \ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \ \ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\ \ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\ \ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\ \ \"acc_norm_stderr\": 0.03437079344106135\n 
},\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \ \ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\ \ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\ \ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\ \ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.04951218252396264,\n\ \ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.04951218252396264\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\ \ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n\ \ \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\ \ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\ \ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\ \ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"\ acc_norm\": 0.42857142857142855,\n 
\"acc_norm_stderr\": 0.02548718714785938\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\ \ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\ \ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \ \ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181015,\n \"\ acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181015\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n \"\ acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\ : 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\ \ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"\ acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n\ \ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657266,\n\ \ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657266\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465066,\n \ \ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465066\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\ \ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\ acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8550458715596331,\n \"acc_stderr\": 0.01509421569970048,\n \"\ acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.01509421569970048\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"\ acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"\ acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601432,\n \ \ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601432\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\ \ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\ \ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462472,\n\ \ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462472\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\ acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\ \ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\ \ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\ \ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\ \ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\ \ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\ \ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\ \ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\ \ \"acc_stderr\": 0.021586494001281376,\n \"acc_norm\": 0.8760683760683761,\n\ \ \"acc_norm_stderr\": 0.021586494001281376\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \ \ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n\ \ \"acc_stderr\": 0.013428186370608313,\n \"acc_norm\": 0.8301404853128991,\n\ \ \"acc_norm_stderr\": 0.013428186370608313\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n\ \ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38994413407821227,\n\ \ \"acc_stderr\": 0.01631237662921307,\n \"acc_norm\": 
0.38994413407821227,\n\ \ \"acc_norm_stderr\": 0.01631237662921307\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729477,\n\ \ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729477\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\ \ \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n\ \ \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959607,\n\ \ \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959607\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \ \ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n\ \ \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n\ \ \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n\ \ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.6830065359477124,\n \"acc_stderr\": 0.018824219512706207,\n \ \ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.018824219512706207\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\ \ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\ \ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\ \ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\ \ },\n 
\"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\ \ \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n\ \ \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \ \ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\ \ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\ \ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\ \ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.44430844553243576,\n\ \ \"mc1_stderr\": 0.017394586250743173,\n \"mc2\": 0.605797177274584,\n\ \ \"mc2_stderr\": 0.015128279082831566\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.813733228097869,\n \"acc_stderr\": 0.010941877955676206\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7217589082638363,\n \ \ \"acc_stderr\": 0.012343803671422682\n }\n}\n```" repo_url: https://huggingface.co/janhq/supermario-v2 leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|arc:challenge|25_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2023-12-12T05-33-32.497051.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|gsm8k|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2023_12_12T05_33_32.497051 path: - 
'**/details_harness|hellaswag|10_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-12T05-33-32.497051.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-management|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-12-12T05-33-32.497051.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-12T05-33-32.497051.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-12T05-33-32.497051.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-international_law|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-management|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-marketing|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2023-12-12T05-33-32.497051.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-sociology|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-virology|5_2023-12-12T05-33-32.497051.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-anatomy|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-astronomy|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-college_biology|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-college_physics|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-computer_security|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-econometrics|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-12T05-33-32.497051.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-global_facts|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-human_aging|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-international_law|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-management|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-marketing|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-marketing|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_medical_genetics_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_miscellaneous_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_moral_disputes_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_moral_scenarios_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_nutrition_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-nutrition|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-nutrition|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_philosophy_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-philosophy|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-philosophy|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_prehistory_5 data_files: - split: 
2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-prehistory|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-prehistory|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_professional_accounting_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_professional_law_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-professional_law|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_law|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_professional_medicine_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_professional_psychology_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_public_relations_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-public_relations|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-public_relations|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_security_studies_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - 
'**/details_harness|hendrycksTest-security_studies|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-security_studies|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_sociology_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-sociology|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-sociology|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_us_foreign_policy_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_virology_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-virology|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-virology|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_hendrycksTest_world_religions_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|hendrycksTest-world_religions|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|hendrycksTest-world_religions|5_2023-12-12T05-33-32.497051.parquet' - config_name: harness_truthfulqa_mc_0 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|truthfulqa:mc|0_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|truthfulqa:mc|0_2023-12-12T05-33-32.497051.parquet' - config_name: harness_winogrande_5 data_files: - split: 2023_12_12T05_33_32.497051 path: - '**/details_harness|winogrande|5_2023-12-12T05-33-32.497051.parquet' - split: latest path: - '**/details_harness|winogrande|5_2023-12-12T05-33-32.497051.parquet' - config_name: results data_files: - split: 
2023_12_12T05_33_32.497051 path: - results_2023-12-12T05-33-32.497051.parquet - split: latest path: - results_2023-12-12T05-33-32.497051.parquet --- # Dataset Card for Evaluation run of janhq/supermario-v2 <!-- Provide a quick summary of the dataset. --> Dataset automatically created during the evaluation run of model [janhq/supermario-v2](https://huggingface.co/janhq/supermario-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard). The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks. The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results. An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)). To load the details from a run, you can for instance do the following: ```python from datasets import load_dataset data = load_dataset("open-llm-leaderboard/details_janhq__supermario-v2", "harness_winogrande_5", split="train") ``` ## Latest results These are the [latest results from run 2023-12-12T05:33:32.497051](https://huggingface.co/datasets/open-llm-leaderboard/details_janhq__supermario-v2/blob/main/results_2023-12-12T05-33-32.497051.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks.
You can find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6533206964078709, "acc_stderr": 0.03205268515858169, "acc_norm": 0.6531030767387064, "acc_norm_stderr": 0.03271825548664744, "mc1": 0.44430844553243576, "mc1_stderr": 0.017394586250743173, "mc2": 0.605797177274584, "mc2_stderr": 0.015128279082831566 }, "harness|arc:challenge|25": { "acc": 0.6578498293515358, "acc_stderr": 0.013864152159177275, "acc_norm": 0.6851535836177475, "acc_norm_stderr": 0.013572657703084948 }, "harness|hellaswag|10": { "acc": 0.6763592909778928, "acc_stderr": 0.004669085411342194, "acc_norm": 0.8650667197769368, "acc_norm_stderr": 0.0034095405332498414 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.33, "acc_stderr": 0.04725815626252605, "acc_norm": 0.33, "acc_norm_stderr": 0.04725815626252605 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6296296296296297, "acc_stderr": 0.041716541613545426, "acc_norm": 0.6296296296296297, "acc_norm_stderr": 0.041716541613545426 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.6907894736842105, "acc_stderr": 0.037610708698674805, "acc_norm": 0.6907894736842105, "acc_norm_stderr": 0.037610708698674805 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.63, "acc_stderr": 0.04852365870939099, "acc_norm": 0.63, "acc_norm_stderr": 0.04852365870939099 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.7056603773584905, "acc_stderr": 0.02804918631569525, "acc_norm": 0.7056603773584905, "acc_norm_stderr": 0.02804918631569525 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.7847222222222222, "acc_stderr": 0.03437079344106135, "acc_norm": 0.7847222222222222, "acc_norm_stderr": 0.03437079344106135 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.53, "acc_stderr": 0.050161355804659205, "acc_norm": 0.53, 
"acc_norm_stderr": 0.050161355804659205 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6647398843930635, "acc_stderr": 0.03599586301247077, "acc_norm": 0.6647398843930635, "acc_norm_stderr": 0.03599586301247077 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.45098039215686275, "acc_stderr": 0.04951218252396264, "acc_norm": 0.45098039215686275, "acc_norm_stderr": 0.04951218252396264 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.78, "acc_stderr": 0.04163331998932263, "acc_norm": 0.78, "acc_norm_stderr": 0.04163331998932263 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.5957446808510638, "acc_stderr": 0.03208115750788684, "acc_norm": 0.5957446808510638, "acc_norm_stderr": 0.03208115750788684 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.4824561403508772, "acc_stderr": 0.04700708033551038, "acc_norm": 0.4824561403508772, "acc_norm_stderr": 0.04700708033551038 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.5241379310344828, "acc_stderr": 0.0416180850350153, "acc_norm": 0.5241379310344828, "acc_norm_stderr": 0.0416180850350153 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.42857142857142855, "acc_stderr": 0.02548718714785938, "acc_norm": 0.42857142857142855, "acc_norm_stderr": 0.02548718714785938 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.4365079365079365, "acc_stderr": 0.04435932892851466, "acc_norm": 0.4365079365079365, "acc_norm_stderr": 0.04435932892851466 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.35, "acc_stderr": 0.047937248544110196, "acc_norm": 0.35, "acc_norm_stderr": 0.047937248544110196 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.7741935483870968, "acc_stderr": 0.023785577884181015, "acc_norm": 0.7741935483870968, "acc_norm_stderr": 0.023785577884181015 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.47783251231527096, "acc_stderr": 0.03514528562175008, "acc_norm": 0.47783251231527096, "acc_norm_stderr": 0.03514528562175008 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.71, "acc_stderr": 0.045604802157206845, "acc_norm": 0.71, "acc_norm_stderr": 0.045604802157206845 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.7757575757575758, "acc_stderr": 0.03256866661681102, "acc_norm": 0.7757575757575758, "acc_norm_stderr": 0.03256866661681102 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.7929292929292929, "acc_stderr": 0.028869778460267045, "acc_norm": 0.7929292929292929, "acc_norm_stderr": 0.028869778460267045 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9015544041450777, "acc_stderr": 0.021500249576033456, "acc_norm": 0.9015544041450777, "acc_norm_stderr": 0.021500249576033456 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.6717948717948717, "acc_stderr": 0.023807633198657266, "acc_norm": 0.6717948717948717, "acc_norm_stderr": 0.023807633198657266 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.36666666666666664, "acc_stderr": 0.029381620726465066, "acc_norm": 0.36666666666666664, "acc_norm_stderr": 0.029381620726465066 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.6848739495798319, "acc_stderr": 0.030176808288974337, "acc_norm": 0.6848739495798319, "acc_norm_stderr": 0.030176808288974337 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.3443708609271523, "acc_stderr": 0.038796870240733264, "acc_norm": 0.3443708609271523, "acc_norm_stderr": 0.038796870240733264 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8550458715596331, "acc_stderr": 0.01509421569970048, "acc_norm": 0.8550458715596331, "acc_norm_stderr": 0.01509421569970048 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5509259259259259, "acc_stderr": 
0.03392238405321617, "acc_norm": 0.5509259259259259, "acc_norm_stderr": 0.03392238405321617 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8137254901960784, "acc_stderr": 0.027325470966716312, "acc_norm": 0.8137254901960784, "acc_norm_stderr": 0.027325470966716312 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.7974683544303798, "acc_stderr": 0.026160568246601432, "acc_norm": 0.7974683544303798, "acc_norm_stderr": 0.026160568246601432 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.6905829596412556, "acc_stderr": 0.03102441174057221, "acc_norm": 0.6905829596412556, "acc_norm_stderr": 0.03102441174057221 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8015267175572519, "acc_stderr": 0.03498149385462472, "acc_norm": 0.8015267175572519, "acc_norm_stderr": 0.03498149385462472 }, "harness|hendrycksTest-international_law|5": { "acc": 0.8099173553719008, "acc_stderr": 0.03581796951709282, "acc_norm": 0.8099173553719008, "acc_norm_stderr": 0.03581796951709282 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.7777777777777778, "acc_stderr": 0.040191074725573483, "acc_norm": 0.7777777777777778, "acc_norm_stderr": 0.040191074725573483 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7730061349693251, "acc_stderr": 0.03291099578615769, "acc_norm": 0.7730061349693251, "acc_norm_stderr": 0.03291099578615769 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.45535714285714285, "acc_stderr": 0.047268355537191, "acc_norm": 0.45535714285714285, "acc_norm_stderr": 0.047268355537191 }, "harness|hendrycksTest-management|5": { "acc": 0.7572815533980582, "acc_stderr": 0.04245022486384495, "acc_norm": 0.7572815533980582, "acc_norm_stderr": 0.04245022486384495 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8760683760683761, "acc_stderr": 0.021586494001281376, "acc_norm": 0.8760683760683761, "acc_norm_stderr": 0.021586494001281376 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.72, "acc_stderr": 
0.04512608598542128, "acc_norm": 0.72, "acc_norm_stderr": 0.04512608598542128 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8301404853128991, "acc_stderr": 0.013428186370608313, "acc_norm": 0.8301404853128991, "acc_norm_stderr": 0.013428186370608313 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.7312138728323699, "acc_stderr": 0.023868003262500104, "acc_norm": 0.7312138728323699, "acc_norm_stderr": 0.023868003262500104 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.38994413407821227, "acc_stderr": 0.01631237662921307, "acc_norm": 0.38994413407821227, "acc_norm_stderr": 0.01631237662921307 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7352941176470589, "acc_stderr": 0.025261691219729477, "acc_norm": 0.7352941176470589, "acc_norm_stderr": 0.025261691219729477 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.7138263665594855, "acc_stderr": 0.025670259242188933, "acc_norm": 0.7138263665594855, "acc_norm_stderr": 0.025670259242188933 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.7561728395061729, "acc_stderr": 0.023891879541959607, "acc_norm": 0.7561728395061729, "acc_norm_stderr": 0.023891879541959607 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.48936170212765956, "acc_stderr": 0.029820747191422473, "acc_norm": 0.48936170212765956, "acc_norm_stderr": 0.029820747191422473 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.46936114732724904, "acc_stderr": 0.012746237711716634, "acc_norm": 0.46936114732724904, "acc_norm_stderr": 0.012746237711716634 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.6911764705882353, "acc_stderr": 0.02806499816704009, "acc_norm": 0.6911764705882353, "acc_norm_stderr": 0.02806499816704009 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.6830065359477124, "acc_stderr": 0.018824219512706207, "acc_norm": 0.6830065359477124, "acc_norm_stderr": 0.018824219512706207 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.6727272727272727, 
"acc_stderr": 0.0449429086625209, "acc_norm": 0.6727272727272727, "acc_norm_stderr": 0.0449429086625209 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7306122448979592, "acc_stderr": 0.02840125202902294, "acc_norm": 0.7306122448979592, "acc_norm_stderr": 0.02840125202902294 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8606965174129353, "acc_stderr": 0.024484487162913973, "acc_norm": 0.8606965174129353, "acc_norm_stderr": 0.024484487162913973 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.86, "acc_stderr": 0.034873508801977704, "acc_norm": 0.86, "acc_norm_stderr": 0.034873508801977704 }, "harness|hendrycksTest-virology|5": { "acc": 0.5421686746987951, "acc_stderr": 0.0387862677100236, "acc_norm": 0.5421686746987951, "acc_norm_stderr": 0.0387862677100236 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8245614035087719, "acc_stderr": 0.029170885500727665, "acc_norm": 0.8245614035087719, "acc_norm_stderr": 0.029170885500727665 }, "harness|truthfulqa:mc|0": { "mc1": 0.44430844553243576, "mc1_stderr": 0.017394586250743173, "mc2": 0.605797177274584, "mc2_stderr": 0.015128279082831566 }, "harness|winogrande|5": { "acc": 0.813733228097869, "acc_stderr": 0.010941877955676206 }, "harness|gsm8k|5": { "acc": 0.7217589082638363, "acc_stderr": 0.012343803671422682 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. 
-->

### Direct Use

<!-- This section describes suitable use cases for the dataset. -->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->

[More Information Needed]

## Dataset Structure

<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->

[More Information Needed]

## Dataset Creation

### Curation Rationale

<!-- Motivation for the creation of this dataset. -->

[More Information Needed]

### Source Data

<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->

#### Data Collection and Processing

<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->

[More Information Needed]

#### Who are the source data producers?

<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->

[More Information Needed]

### Annotations [optional]

<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->

#### Annotation process

<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->

[More Information Needed]

#### Who are the annotators?

<!-- This section describes the people or systems who created the annotations. -->

[More Information Needed]

#### Personal and Sensitive Information

<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

## Citation [optional]

<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Dataset Card Authors [optional]

[More Information Needed]

## Dataset Card Contact

[More Information Needed]
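As a small illustrative sketch (not part of the original card: the variable names and the miniature payload are my own), per-task metrics in a results payload shaped like the JSON at the top of this card can be sliced with ordinary Python. The `raw` string below copies just two of the task entries shown, plus the `"all"` aggregate:

```python
import json

# Miniature results payload in the same nested shape as the card's JSON;
# "all" holds the aggregates, the other keys are per-benchmark entries.
raw = """
{
  "all": {"acc": 0.6533206964078709, "acc_stderr": 0.03205268515858169},
  "harness|arc:challenge|25": {"acc": 0.6578498293515358, "acc_norm": 0.6851535836177475},
  "harness|hellaswag|10": {"acc": 0.6763592909778928, "acc_norm": 0.8650667197769368}
}
"""
results = json.loads(raw)

# Collect normalized accuracy per benchmark, skipping the "all" aggregate.
acc_norm = {
    task: metrics["acc_norm"]
    for task, metrics in results.items()
    if task != "all" and "acc_norm" in metrics
}

best_task = max(acc_norm, key=acc_norm.get)
print(best_task)  # harness|hellaswag|10
```

The same pattern scales to the full payload, since every `harness|...` entry carries the same metric keys.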
NeSTudio/NestQuad
---
license: apache-2.0
task_categories:
- question-answering
language:
- ru
pretty_name: nestquad
size_categories:
- 10K<n<100K
---

# NestQuad

This dataset is a union of Sberquad and our own dataset, built with the wizard method. It is intended for Q&A systems.

| <!-- --> | <!-- --> |
|----------|----------|
| Size | 75300 |
| Augmentation per context | 5.48 |
| Recency | 2023 |
| Objectivity (assessed) | 70% |
| Objectivity (structure) | 95% |
| Integrity | 90% |
| Relevance | 60% |
| Compatibility | 90% |
| Unique answers | 49161 |
| Unique contexts | 13728 |

Dataset structure:

| id | cluster | title | context | question | answers | answers_start | answers_end |
|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|:----------:|
| Unique ID | Set of general topics | Set of specific topics | Context | Question | Answer | Answer start offset in the context | Answer end offset in the context |

| Sources |
|----------|
| Sberquad (huggingface) - https://huggingface.co/datasets/sberquad |
| Tourism information - https://tour-poisk.com/articles/, https://www.sravni.ru/enciklopediya/turizm/oteli/ |

@MISC{NestQuad,
  author = {Emelyanov Anton, Nosov Andrey, Chernikov Kirill, Veselinovich Aleksandra, Nastalovskaya Tasia, Rastopshin Andrey},
  title  = {Russian dataset for Instruct/Chat models},
  url    = {https://huggingface.co/datasets/NeSTudio/NestQuad},
  year   = {2023}
}
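For illustration (the record below is invented and only follows the column layout documented in the structure table above), `answers_start`/`answers_end` are character offsets such that slicing `context` recovers `answers`, in the usual SQuAD style:

```python
# Hypothetical NestQuad-style record; real rows come from the dataset itself.
record = {
    "id": 0,
    "context": "Moscow is the capital of Russia.",
    "question": "What is the capital of Russia?",
    "answers": "Moscow",
    "answers_start": 0,
    "answers_end": 6,
}

# The answer text should be recoverable by slicing the context.
span = record["context"][record["answers_start"]:record["answers_end"]]
assert span == record["answers"]
print(span)  # Moscow
```

Checking this invariant over every row is a quick way to validate offsets after any preprocessing step.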
mayank1307/pdp_tokens
---
dataset_info:
  features:
  - name: text
    dtype: string
  splits:
  - name: train
    num_bytes: 2439583
    num_examples: 9105
  download_size: 560074
  dataset_size: 2439583
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---

# Dataset Card for "pdp_tokens"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
allganize/math_table_qa
---
language:
- ko
---

# mathqa-ko

- The `mathqa-ko` data is a QA dataset in the general domain. Given a context and a question, the model must generate the answer corresponding to the question. Only text is provided as input.

### Building the test data

- The passage and question number (and unit) from the original data are used as-is.
- 90 examples were randomly selected for testing.

### Data source

- [Numerical-operation machine reading comprehension data](https://www.aihub.or.kr/aihubdata/data/view.do?currMenu=115&topMenu=100&aihubDataSe=data&dataSetSn=71568)

# tableqa-ko

- The `tableqa-ko` data is a QA dataset in the general domain. Given a context and a question, the model must generate the answer corresponding to the question; both text and a table are provided as input. When the `conversation_id` contains '`tableqa-short`', a short answer of at most three words (eojeol) must be generated, whereas when it contains '`tableqa-long`', long-form text must be generated.

### Building the test data

- The context and question from the original data are used. To obtain natural LLM-style answers, reference answers were generated with GPT-4.
- 104 examples were randomly selected for testing: 32 tableqa-short and 72 tableqa-long.

### Data source

- [AIHub table-based question answering data](https://www.aihub.or.kr/aihubdata/data/view.do?currMenu=115&topMenu=100&aihubDataSe=data&dataSetSn=71565)

# Why the data is not public

In accordance with clause 5 of the data usage policy for AI Hub open data ([AIHub data usage policy](https://www.aihub.or.kr/intrcn/guid/usagepolicy.do?currMenu=151&topMenu=105)), we note that the test data cannot be released.
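As a hedged sketch (the example `conversation_id` values are made up; only the `tableqa-short`/`tableqa-long` substring rule comes from the card), examples can be routed to the expected answer style like this:

```python
# Hypothetical conversation_id values; the real ones come from the dataset.
ids = ["tableqa-short-0001", "tableqa-long-0042", "tableqa-short-0100"]

def answer_style(conversation_id: str) -> str:
    """Decide which answer style an example expects, per the card's rule."""
    if "tableqa-short" in conversation_id:
        return "short"  # at most three words (eojeol)
    if "tableqa-long" in conversation_id:
        return "long"   # long-form free text
    raise ValueError(f"unrecognized conversation_id: {conversation_id}")

styles = [answer_style(i) for i in ids]
print(styles)  # ['short', 'long', 'short']
```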
distilled-one-sec-cv12-each-chunk-uniq/chunk_174
---
dataset_info:
  features:
  - name: logits
    sequence: float32
  - name: mfcc
    sequence:
      sequence: float64
  splits:
  - name: train
    num_bytes: 1125570768.0
    num_examples: 219324
  download_size: 1153707611
  dataset_size: 1125570768.0
---

# Dataset Card for "chunk_174"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
jxta/em
---
license: apache-2.0
---
hope04302/plantVillageDataset
---
license: unknown
---
efoley/sar_tile_512
---
dataset_info:
  features:
  - name: image
    dtype: image
  splits:
  - name: train
    num_bytes: 255531308.653
    num_examples: 1673
  download_size: 239797141
  dataset_size: 255531308.653
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
Muennighoff/natural-instructions
---
annotations_creators:
- crowdsourced
- expert-generated
language:
- en
multilinguality:
- monolingual
size_categories:
- 100M<n<1B
task_categories:
- other
---

Preprocessed version of Super-Natural-Instructions from https://github.com/allenai/natural-instructions/tree/master/splits. The same inputs may appear with different outputs; to avoid duplicate inputs, you can deduplicate by the `id` or the `inputs` field.

Train Tasks:
```
['task001_quoref_question_generation', 'task002_quoref_answer_generation', 'task022_cosmosqa_passage_inappropriate_binary', 'task023_cosmosqa_question_generation', 'task024_cosmosqa_answer_generation', 'task025_cosmosqa_incorrect_answer_generation', 'task026_drop_question_generation', 'task027_drop_answer_type_generation', 'task028_drop_answer_generation', 'task043_essential_terms_answering_incomplete_questions', 'task044_essential_terms_identifying_essential_words', 'task045_miscellaneous_sentence_paraphrasing', 'task046_miscellaneous_question_typing', 'task047_miscellaneous_answering_science_questions', 'task059_ropes_story_generation', 'task060_ropes_question_generation', 'task061_ropes_answer_generation', 'task062_bigbench_repeat_copy_logic', 'task063_first_i_elements', 'task064_all_elements_except_first_i', 'task065_timetravel_consistent_sentence_classification', 'task066_timetravel_binary_consistency_classification', 'task067_abductivenli_answer_generation', 'task068_abductivenli_incorrect_answer_generation', 'task069_abductivenli_classification', 'task070_abductivenli_incorrect_classification', 'task071_abductivenli_answer_generation', 'task072_abductivenli_answer_generation', 'task073_commonsenseqa_answer_generation', 'task074_squad1.1_question_generation', 'task075_squad1.1_answer_generation', 'task076_splash_correcting_sql_mistake', 'task077_splash_explanation_to_sql', 'task078_all_elements_except_last_i', 'task079_conala_concat_strings', 'task080_piqa_answer_generation', 'task081_piqa_wrong_answer_generation',
'task082_babi_t1_single_supporting_fact_question_generation', 'task083_babi_t1_single_supporting_fact_answer_generation', 'task084_babi_t1_single_supporting_fact_identify_relevant_fact', 'task085_unnatural_addsub_arithmetic', 'task087_new_operator_addsub_arithmetic', 'task088_identify_typo_verification', 'task089_swap_words_verification', 'task090_equation_learner_algebra', 'task091_all_elements_from_index_i_to_j', 'task092_check_prime_classification', 'task093_conala_normalize_lists', 'task094_conala_calculate_mean', 'task095_conala_max_absolute_value', 'task096_conala_list_index_subtraction', 'task097_conala_remove_duplicates', 'task098_conala_list_intersection', 'task099_reverse_elements_between_index_i_and_j', 'task100_concatenate_all_elements_from_index_i_to_j', 'task101_reverse_and_concatenate_all_elements_from_index_i_to_j', 'task103_facts2story_long_text_generation', 'task104_semeval_2019_task10_closed_vocabulary_mathematical_answer_generation', 'task105_story_cloze-rocstories_sentence_generation', 'task107_splash_question_to_sql', 'task1087_two_number_sum', 'task1088_array_of_products', 'task1089_check_monotonic_array', 'task108_contextualabusedetection_classification', 'task109_smsspamcollection_spamsmsdetection', 'task110_logic2text_sentence_generation', 'task111_asset_sentence_simplification', 'task112_asset_simple_sentence_identification', 'task1135_xcsr_en_commonsense_mc_classification', 'task113_count_frequency_of_letter', 'task1146_country_capital', 'task1147_country_currency', 'task1148_maximum_ascii_value', 'task1149_item_check_edible', 'task114_is_the_given_word_longest', 'task1150_delete_max_min', 'task1151_swap_max_min', 'task115_help_advice_classification', 'task1167_penn_treebank_coarse_pos_tagging', 'task1168_brown_coarse_pos_tagging', 'task116_com2sense_commonsense_reasoning', 'task1186_nne_hrngo_classification', 'task1188_count_max_freq_char', 'task1189_check_char_in_string', 
'task118_semeval_2019_task10_open_vocabulary_mathematical_answer_generation', 'task1190_add_integer_to_list', 'task1191_food_veg_nonveg', 'task1192_food_flavor_profile', 'task1193_food_course_classification', 'task1194_kth_largest_element', 'task1196_atomic_classification_oeffect', 'task1197_atomic_classification_oreact', 'task1198_atomic_classification_owant', 'task1199_atomic_classification_xattr', 'task119_semeval_2019_task10_geometric_mathematical_answer_generation', 'task1200_atomic_classification_xeffect', 'task1201_atomic_classification_xintent', 'task1202_atomic_classification_xneed', 'task1203_atomic_classification_xreact', 'task1204_atomic_classification_hinderedby', 'task1205_atomic_classification_isafter', 'task1206_atomic_classification_isbefore', 'task1207_atomic_classification_atlocation', 'task1208_atomic_classification_xreason', 'task1209_atomic_classification_objectuse', 'task1210_atomic_classification_madeupof', 'task1211_atomic_classification_hassubevent', 'task1212_atomic_classification_hasproperty', 'task1213_atomic_classification_desires', 'task1214_atomic_classification_xwant', 'task1215_atomic_classification_capableof', 'task1216_atomic_classification_causes', 'task1217_atomic_answer_generation', 'task122_conala_list_index_addition', 'task123_conala_sort_dictionary', 'task124_conala_pair_averages', 'task125_conala_pair_differences', 'task126_scan_structured_text_generation_command_action_all', 'task127_scan_long_text_generation_action_command_all', 'task1283_hrngo_quality_classification', 'task1284_hrngo_informativeness_classification', 'task1285_kpa_keypoint_matching', 'task1286_openbookqa_question_answering', 'task1288_glue_mrpc_paraphrasing', 'task1289_trec_classification', 'task128_scan_structured_text_generation_command_action_short', 'task1290_xsum_summarization', 'task1291_multi_news_summarization', 'task1292_yelp_review_full_text_categorization', 'task1293_kilt_tasks_hotpotqa_question_answering', 
'task1294_wiki_qa_answer_verification', 'task1295_adversarial_qa_question_answering', 'task1296_wiki_hop_question_answering', 'task129_scan_long_text_generation_action_command_short', 'task1308_amazonreview_category_classification', 'task1309_amazonreview_summary_classification', 'task130_scan_structured_text_generation_command_action_long', 'task1310_amazonreview_rating_classification', 'task1311_amazonreview_rating_classification', 'task1312_amazonreview_polarity_classification', 'task1313_amazonreview_polarity_classification', 'task1314_country_abbreviation', 'task1315_find_range_array', 'task1316_remove_duplicates_string', 'task1317_country_calling_code', 'task1318_country_national_dish', 'task1319_country_by_barcode_prefix', 'task131_scan_long_text_generation_action_command_long', 'task1320_country_domain_tld', 'task1321_country_continent', 'task1322_country_government_type', 'task1325_qa_zre_question_generation_on_subject_relation', 'task1326_qa_zre_question_generation_from_answer', 'task1327_qa_zre_answer_generation_from_question', 'task1328_qa_zre_relation_generation_from_question', 'task132_dais_text_modification', 'task1331_reverse_array', 'task1332_check_leap_year', 'task1333_check_validity_date_ddmmyyyy', 'task1336_peixian_equity_evaluation_corpus_gender_classifier', 'task1338_peixian_equity_evaluation_corpus_sentiment_classifier', 'task1339_peixian_equity_evaluation_corpus_text_completion', 'task1340_msr_text_compression_compression', 'task1341_msr_text_classification', 'task1346_glue_cola_grammatical_correctness_classification', 'task1347_glue_sts-b_similarity_classification', 'task1354_sent_comp_classification', 'task1355_sent_comp_summarization', 'task1359_numer_sense_answer_generation', 'task1360_numer_sense_multiple_choice_qa_generation', 'task1361_movierationales_classification', 'task1364_hans_answer_generation', 'task1366_healthfact_classification', 'task1368_healthfact_sentence_generation', 'task1369_healthfact_sentence_generation', 
'task1378_quarel_correct_answer_generation', 'task1379_quarel_incorrect_answer_generation', 'task137_detoxifying-lms_classification_toxicity', 'task1380_quarel_correct_option_generation', 'task1381_quarel_incorrect_option_generation', 'task1382_quarel_write_correct_answer', 'task1383_quarel_write_incorrect_answer', 'task1384_deal_or_no_dialog_classification', 'task1389_hellaswag_completion', 'task138_detoxifying-lms_classification_fluency', 'task1398_obqa_question_generation', 'task1399_obqa_answer_generation', 'task139_detoxifying-lms_classification_topicality', 'task1400_obqa_incorrect_answer_generation', 'task1401_obqa_sentence_generation', 'task1403_check_validity_date_mmddyyyy', 'task1404_date_conversion', 'task1405_find_median', 'task1406_kth_smallest_element', 'task140_detoxifying-lms_classification_style', 'task1412_web_questions_question_answering', 'task1418_bless_semantic_relation_classification', 'task1419_mathqa_gain', 'task141_odd-man-out_classification_category', 'task1420_mathqa_general', 'task1421_mathqa_other', 'task1422_mathqa_physics', 'task1423_mathqa_geometry', 'task1424_mathqa_probability', 'task1425_country_iso_numeric', 'task1426_country_independence_year', 'task1427_country_region_in_world', 'task1428_country_surface_area', 'task1429_evalution_semantic_relation_classification', 'task142_odd-man-out_classification_no_category', 'task1431_head_qa_answer_generation', 'task1434_head_qa_classification', 'task143_odd-man-out_classification_generate_category', 'task1443_string_to_number', 'task1444_round_power_of_two', 'task1445_closest_integers', 'task1446_farthest_integers', 'task1447_drug_extraction_ade', 'task1448_disease_entity_extraction_ncbi_dataset', 'task1449_disease_entity_extraction_bc5cdr_dataset', 'task144_subjqa_question_answering', 'task1451_drug_dose_extraction', 'task1452_location_entity_extraction_btc_corpus', 'task1453_person_entity_extraction_btc_corpus', 'task145_afs_argument_similarity_death_penalty', 
'task146_afs_argument_similarity_gun_control', 'task1479_organization_entity_extraction_btc_corpus', 'task147_afs_argument_similarity_gay_marriage', 'task1480_gene_extraction_jnlpba_dataset', 'task1481_gene_extraction_bc2gm_dataset', 'task1482_gene_extraction_chemprot_dataset', 'task1483_chemical_extraction_chemprot_dataset', 'task1484_gene_extraction_linnaeus_dataset', 'task1485_organ_extraction_anem_dataset', 'task1486_cell_extraction_anem_dataset', 'task1487_organism_substance_extraction_anem_dataset', 'task1488_sarcasmdetection_headline_classification', 'task1489_sarcasmdetection_tweet_classification', 'task148_afs_argument_quality_gay_marriage', 'task1495_adverse_drug_event_classification', 'task1498_24hour_to_12hour_clock', 'task1499_dstc3_summarization', 'task149_afs_argument_quality_death_penalty', 'task1500_dstc3_classification', 'task1501_dstc3_answer_generation', 'task1502_hatexplain_classification', 'task1503_hatexplain_classification', 'task1504_hatexplain_answer_generation', 'task1505_root09_semantic_relation_classification', 'task1506_celebrity_minimal_dob_span', 'task1507_boolean_temporal_reasoning', 'task1508_wordnet_antonyms', 'task1509_evalution_antonyms', 'task150_afs_argument_quality_gun_control', 'task1510_evalution_relation_extraction', 'task1517_limit_classfication', 'task1518_limit_answer_generation', 'task1519_qa_srl_question_generation', 'task151_tomqa_find_location_easy_clean', 'task1520_qa_srl_answer_generation', 'task152_tomqa_find_location_easy_noise', 'task153_tomqa_find_location_hard_clean', 'task1541_agnews_classification', 'task1542_every_ith_element_from_starting', 'task1548_wiqa_binary_classification', 'task1549_wiqa_answer_generation_missing_step', 'task154_tomqa_find_location_hard_noise', 'task1551_every_ith_element_from_kth_element', 'task1553_cnn_dailymail_summarization', 'task1559_blimp_binary_classification', 'task155_count_nouns_verbs', 'task1560_blimp_binary_classification', 'task1564_triviaqa_answer_generation', 
'task1565_triviaqa_classification', 'task1566_propara_structured_text_generation', 'task1567_propara_question_generation', 'task1568_propara_classification', 'task156_codah_classification_adversarial', 'task1572_samsum_summary', 'task1573_samsum_classification', 'task157_count_vowels_and_consonants', 'task1580_eqasc-perturbed_question_generation', 'task1581_eqasc-perturbed_answer_generation', 'task1582_bless_hypernym_generation', 'task1583_bless_meronym_classification', 'task1584_evalution_meronym_classification', 'task1585_root09_hypernym_generation', 'task158_count_frequency_of_words', 'task1590_diplomacy_text_generation', 'task1592_yahoo_answers_topics_classfication', 'task1593_yahoo_answers_topics_classification', 'task1594_yahoo_answers_topics_question_generation', 'task1595_event2mind_text_generation_1', 'task1596_event2mind_text_generation_2', 'task1599_smcalflow_classification', 'task159_check_frequency_of_words_in_sentence_pair', 'task1600_smcalflow_sentence_generation', 'task1601_webquestions_answer_generation', 'task1602_webquestion_question_genreation', 'task1603_smcalflow_sentence_generation', 'task1604_ethos_text_classification', 'task1605_ethos_text_classification', 'task1606_ethos_text_classification', 'task1607_ethos_text_classification', 'task1608_xquad_en_answer_generation', 'task1609_xquad_en_question_generation', 'task160_replace_letter_in_a_sentence', 'task161_count_words_containing_letter', 'task162_count_words_starting_with_letter', 'task163_count_words_ending_with_letter', 'task1645_medical_question_pair_dataset_text_classification', 'task164_mcscript_question_answering_text', 'task1656_gooaq_answer_generation', 'task1657_gooaq_question_generation', 'task165_mcscript_question_answering_commonsense', 'task1660_super_glue_question_generation', 'task1661_super_glue_classification', 'task1665_trainglecopa_question_generation', 'task1669_md_gender_bias_text_modification', 'task166_clariq_sentence_generation', 
'task1670_md_gender_bias_text_modification', 'task1678_mathqa_answer_selection', 'task167_strategyqa_question_generation', 'task168_strategyqa_question_decomposition', 'task169_strategyqa_sentence_generation', 'task1703_ljspeech_textmodification', 'task1704_ljspeech_textmodification', 'task1705_ljspeech_classification', 'task1706_ljspeech_classification', 'task170_hotpotqa_answer_generation', 'task1711_poki_text_generation', 'task1712_poki_classification', 'task1713_convai3_sentence_generation', 'task1714_convai3_sentence_generation', 'task1720_civil_comments_toxicity_classification', 'task1721_civil_comments_obscenity_classification', 'task1722_civil_comments_threat_classification', 'task1723_civil_comments_sexuallyexplicit_classification', 'task1724_civil_comments_insult_classification', 'task1725_civil_comments_severtoxicity_classification', 'task1726_mathqa_correct_answer_generation', 'task1727_wiqa_what_is_the_effect', 'task1729_personachat_generate_next', 'task1730_personachat_choose_next', 'task1731_quartz_question_answering', 'task176_break_decompose_questions', 'task177_para-nmt_paraphrasing', 'task178_quartz_question_answering', 'task179_participant_extraction', 'task180_intervention_extraction', 'task181_outcome_extraction', 'task182_duorc_question_generation', 'task183_rhyme_generation', 'task184_break_generate_question', 'task191_hotpotqa_question_generation', 'task192_hotpotqa_sentence_generation', 'task193_duorc_question_generation', 'task194_duorc_answer_generation', 'task195_sentiment140_classification', 'task196_sentiment140_answer_generation', 'task205_remove_even_elements', 'task206_collatz_conjecture', 'task207_max_element_lists', 'task208_combinations_of_list', 'task209_stancedetection_classification', 'task210_logic2text_structured_text_generation', 'task211_logic2text_classification', 'task212_logic2text_classification', 'task223_quartz_explanation_generation', 'task227_clariq_classification', 'task228_arc_answer_generation_easy', 
'task229_arc_answer_generation_hard', 'task243_count_elements_in_set_intersection', 'task244_count_elements_in_set_union', 'task245_check_presence_in_set_intersection', 'task246_dream_question_generation', 'task247_dream_answer_generation', 'task248_dream_classification', 'task267_concatenate_and_reverse_all_elements_from_index_i_to_j', 'task268_casehold_legal_answer_generation', 'task269_csrg_counterfactual_story_generation', 'task270_csrg_counterfactual_context_generation', 'task274_overruling_legal_classification', 'task275_enhanced_wsc_paraphrase_generation', 'task276_enhanced_wsc_classification', 'task277_stereoset_sentence_generation_stereotype', 'task278_stereoset_sentence_generation_antistereotype', 'task279_stereoset_classification_stereotype', 'task280_stereoset_classification_stereotype_type', 'task283_dream_incorrect_answer_generation', 'task284_imdb_classification', 'task285_imdb_answer_generation', 'task286_olid_offense_judgment', 'task287_casehold_legal_incorrect_answer_generation', 'task291_semeval_2020_task4_commonsense_validation', 'task292_storycommonsense_character_text_generation', 'task293_storycommonsense_emotion_text_generation', 'task294_storycommonsense_motiv_text_generation', 'task295_semeval_2020_task4_commonsense_reasoning', 'task296_storycloze_correct_end_classification', 'task297_storycloze_incorrect_end_classification', 'task298_storycloze_correct_end_classification', 'task299_storycloze_sentence_generation', 'task300_storycloze_order_generation', 'task301_record_question_generation', 'task302_record_classification', 'task303_record_incorrect_answer_generation', 'task305_jeopardy_answer_generation_normal', 'task306_jeopardy_answer_generation_double', 'task307_jeopardy_answer_generation_final', 'task308_jeopardy_answer_generation_all', 'task309_race_answer_generation', 'task310_race_classification', 'task311_race_question_generation', 'task316_crows-pairs_classification_stereotype', 
'task317_crows-pairs_classification_stereotype_type', 'task318_stereoset_classification_gender', 'task319_stereoset_classification_profession', 'task320_stereoset_classification_race', 'task321_stereoset_classification_religion', 'task322_jigsaw_classification_threat', 'task323_jigsaw_classification_sexually_explicit', 'task324_jigsaw_classification_disagree', 'task325_jigsaw_classification_identity_attack', 'task326_jigsaw_classification_obscene', 'task327_jigsaw_classification_toxic', 'task328_jigsaw_classification_insult', 'task333_hateeval_classification_hate_en', 'task335_hateeval_classification_aggresive_en', 'task337_hateeval_classification_individual_en', 'task339_record_answer_generation', 'task340_winomt_classification_gender_pro', 'task341_winomt_classification_gender_anti', 'task342_winomt_classification_profession_pro', 'task343_winomt_classification_profession_anti', 'task344_hybridqa_answer_generation', 'task345_hybridqa_answer_generation', 'task346_hybridqa_classification', 'task347_hybridqa_incorrect_answer_generation', 'task350_winomt_classification_gender_identifiability_pro', 'task351_winomt_classification_gender_identifiability_anti', 'task353_casino_classification_negotiation_elicit_pref', 'task354_casino_classification_negotiation_no_need', 'task355_casino_classification_negotiation_other_need', 'task356_casino_classification_negotiation_self_need', 'task357_casino_classification_negotiation_small_talk', 'task358_casino_classification_negotiation_uv_part', 'task359_casino_classification_negotiation_vouch_fair', 'task363_sst2_polarity_classification', 'task364_regard_social_impact_classification', 'task365_synthetic_remove_vowels', 'task366_synthetic_return_primes', 'task367_synthetic_remove_floats', 'task368_synthetic_even_or_odd_calculation', 'task369_synthetic_remove_odds', 'task370_synthetic_remove_divisible_by_3', 'task371_synthetic_product_of_list', 'task372_synthetic_palindrome_numbers', 'task373_synthetic_round_tens_place', 
'task374_synthetic_pos_or_neg_calculation', 'task375_classify_type_of_sentence_in_debate', 'task376_reverse_order_of_words', 'task377_remove_words_of_given_length', 'task378_reverse_words_of_given_length', 'task379_agnews_topic_classification', 'task380_boolq_yes_no_question', 'task381_boolq_question_generation', 'task382_hybridqa_answer_generation', 'task383_matres_classification', 'task384_socialiqa_question_classification', 'task385_socialiqa_incorrect_answer_generation', 'task386_semeval_2018_task3_irony_detection', 'task387_semeval_2018_task3_irony_classification', 'task388_torque_token_classification', 'task389_torque_generate_temporal_question', 'task390_torque_text_span_selection', 'task397_semeval_2018_task1_tweet_anger_detection', 'task398_semeval_2018_task1_tweet_joy_detection', 'task399_semeval_2018_task1_tweet_sadness_detection', 'task400_paws_paraphrase_classification', 'task403_creak_commonsense_inference', 'task405_narrativeqa_question_generation', 'task413_mickey_en_sentence_perturbation_generation', 'task428_senteval_inversion', 'task429_senteval_tense', 'task430_senteval_subject_count', 'task431_senteval_object_count', 'task453_swag_answer_generation', 'task454_swag_incorrect_answer_generation', 'task455_swag_context_generation', 'task456_matres_intention_classification', 'task457_matres_conditional_classification', 'task458_matres_negation_classification', 'task459_matres_static_classification', 'task460_qasper_answer_generation', 'task461_qasper_question_generation', 'task462_qasper_classification', 'task469_mrqa_answer_generation', 'task470_mrqa_question_generation', 'task471_haspart_answer_generation', 'task472_haspart_classification', 'task475_yelp_polarity_classification', 'task476_cls_english_books_classification', 'task477_cls_english_dvd_classification', 'task478_cls_english_music_classification', 'task488_extract_all_alphabetical_elements_from_list_in_order', 'task489_mwsc_question_generation', 'task490_mwsc_options_generation', 
'task491_mwsc_answer_generation', 'task492_mwsc_incorrect_answer_generation', 'task493_review_polarity_classification', 'task494_review_polarity_answer_generation', 'task495_semeval_headline_classification', 'task496_semeval_answer_generation', 'task497_extract_all_numbers_from_list_in_order', 'task499_extract_and_add_all_numbers_from_list', 'task504_count_all_alphabetical_elements_in_list', 'task505_count_all_numerical_elements_in_list', 'task506_position_of_all_alphabetical_elements_in_list', 'task507_position_of_all_numerical_elements_in_list', 'task509_collate_of_all_alphabetical_and_numerical_elements_in_list_separately', 'task512_twitter_emotion_classification', 'task513_argument_stance_classification', 'task514_argument_consequence_classification', 'task515_senteval_odd_word_out', 'task516_senteval_conjoints_inversion', 'task517_emo_classify_emotion_of_dialogue', 'task518_emo_different_dialogue_emotions', 'task521_trivia_question_classification', 'task522_news_editorial_summary', 'task523_find_if_numbers_or_alphabets_are_more_in_list', 'task547_alt_translation_entk_en', 'task550_discofuse_sentence_generation', 'task560_alt_translation_en_entk', 'task563_discofuse_answer_generation', 'task564_discofuse_classification', 'task565_circa_answer_generation', 'task566_circa_classification', 'task567_circa_text_generation', 'task568_circa_question_generation', 'task573_air_dialogue_classification', 'task574_air_dialogue_sentence_generation', 'task575_air_dialogue_classification', 'task576_curiosity_dialogs_answer_generation', 'task577_curiosity_dialogs_classification', 'task578_curiosity_dialogs_answer_generation', 'task579_socialiqa_classification', 'task580_socialiqa_answer_generation', 'task581_socialiqa_question_generation', 'task582_naturalquestion_answer_generation', 'task583_udeps_eng_coarse_pos_tagging', 'task584_udeps_eng_fine_pos_tagging', 'task585_preposition_classification', 'task586_amazonfood_polarity_classification', 
'task587_amazonfood_polarity_correction_classification', 'task588_amazonfood_rating_classification', 'task589_amazonfood_summary_text_generation', 'task590_amazonfood_summary_correction_classification', 'task591_sciq_answer_generation', 'task592_sciq_incorrect_answer_generation', 'task593_sciq_explanation_generation', 'task594_sciq_question_generation', 'task595_mocha_answer_generation', 'task596_mocha_question_generation', 'task597_cuad_answer_generation', 'task598_cuad_answer_generation', 'task599_cuad_question_generation', 'task600_find_the_longest_common_substring_in_two_strings', 'task605_find_the_longest_common_subsequence_in_two_lists', 'task606_sum_of_all_numbers_in_list_between_positions_i_and_j', 'task607_sbic_intentional_offense_binary_classification', 'task608_sbic_sexual_offense_binary_classification', 'task609_sbic_potentially_offense_binary_classification', 'task610_conllpp_ner', 'task611_mutual_multi_turn_dialogue', 'task615_moviesqa_answer_generation', 'task616_cola_classification', 'task617_amazonreview_category_text_generation', 'task618_amazonreview_summary_text_generation', 'task622_replace_alphabets_in_a_list_by_their_position_in_english_alphabet', 'task625_xlwic_true_or_false_answer_generation', 'task626_xlwic_sentence_based_on_given_word_sentence_generation', 'task627_xlwic_word_with_same_meaning_sentence_generation', 'task628_xlwic_word_with_different_meaning_sentence_generation', 'task629_dbpedia_14_classification', 'task630_dbpedia_14_classification', 'task631_dbpedia_14_incorrect_answer_generation', 'task632_dbpedia_14_classification', 'task633_dbpedia_14_answer_generation', 'task636_extract_and_sort_unique_alphabets_in_a_list', 'task637_extract_and_sort_unique_digits_in_a_list', 'task638_multi_woz_classification', 'task639_multi_woz_user_utterance_generation', 'task649_race_blank_question_generation', 'task664_mmmlu_answer_generation_abstract_algebra', 'task665_mmmlu_answer_generation_anatomy', 
'task666_mmmlu_answer_generation_astronomy', 'task667_mmmlu_answer_generation_business_ethics', 'task668_extreme_abstract_summarization', 'task672_amazon_and_yelp_summarization_dataset_summarization', 'task672_nummersense', 'task673_google_wellformed_query_classification', 'task674_google_wellformed_query_sentence_generation', 'task675_google_wellformed_query_sentence_generation', 'task679_hope_edi_english_text_classification', 'task681_hope_edi_malayalam_text_classification', 'task682_online_privacy_policy_text_classification', 'task683_online_privacy_policy_text_purpose_answer_generation', 'task684_online_privacy_policy_text_information_type_generation', 'task685_mmmlu_answer_generation_clinical_knowledge', 'task686_mmmlu_answer_generation_college_biology', 'task687_mmmlu_answer_generation_college_chemistry', 'task688_mmmlu_answer_generation_college_computer_science', 'task689_mmmlu_answer_generation_college_mathematics', 'task690_mmmlu_answer_generation_college_medicine', 'task691_mmmlu_answer_generation_college_physics', 'task692_mmmlu_answer_generation_computer_security', 'task693_mmmlu_answer_generation_conceptual_physics', 'task694_mmmlu_answer_generation_econometrics', 'task695_mmmlu_answer_generation_electrical_engineering', 'task696_mmmlu_answer_generation_elementary_mathematics', 'task697_mmmlu_answer_generation_formal_logic', 'task698_mmmlu_answer_generation_global_facts', 'task699_mmmlu_answer_generation_high_school_biology', 'task700_mmmlu_answer_generation_high_school_chemistry', 'task701_mmmlu_answer_generation_high_school_computer_science', 'task702_mmmlu_answer_generation_high_school_european_history', 'task703_mmmlu_answer_generation_high_school_geography', 'task704_mmmlu_answer_generation_high_school_government_and_politics', 'task705_mmmlu_answer_generation_high_school_macroeconomics', 'task706_mmmlu_answer_generation_high_school_mathematics', 'task707_mmmlu_answer_generation_high_school_microeconomics', 
'task708_mmmlu_answer_generation_high_school_physics', 'task709_mmmlu_answer_generation_high_school_psychology', 'task710_mmmlu_answer_generation_high_school_statistics', 'task711_mmmlu_answer_generation_high_school_us_history', 'task712_mmmlu_answer_generation_high_school_world_history', 'task713_mmmlu_answer_generation_human_aging', 'task714_mmmlu_answer_generation_human_sexuality', 'task715_mmmlu_answer_generation_international_law', 'task716_mmmlu_answer_generation_jurisprudence', 'task717_mmmlu_answer_generation_logical_fallacies', 'task718_mmmlu_answer_generation_machine_learning', 'task719_mmmlu_answer_generation_management', 'task720_mmmlu_answer_generation_marketing', 'task721_mmmlu_answer_generation_medical_genetics', 'task722_mmmlu_answer_generation_random_topic', 'task723_mmmlu_answer_generation_moral_disputes', 'task724_mmmlu_answer_generation_moral_scenarios', 'task725_mmmlu_answer_generation_nutrition', 'task726_mmmlu_answer_generation_philosophy', 'task727_mmmlu_answer_generation_prehistory', 'task728_mmmlu_answer_generation_professional_accounting', 'task729_mmmlu_answer_generation_professional_law', 'task730_mmmlu_answer_generation_professional_medicine', 'task731_mmmlu_answer_generation_professional_psychology', 'task732_mmmlu_answer_generation_public_relations', 'task733_mmmlu_answer_generation_security_studies', 'task734_mmmlu_answer_generation_sociology', 'task735_mmmlu_answer_generation_us_foreign_policy', 'task736_mmmlu_answer_generation_virology', 'task737_mmmlu_answer_generation_world_religions', 'task739_lhoestq_question_generation', 'task740_lhoestq_answer_generation_quantity', 'task741_lhoestq_answer_generation_place', 'task742_lhoestq_answer_generation_frequency', 'task745_ai2_arithmetic_questions_arithmetic', 'task746_yelp_restaurant_review_classification', 'task750_aqua_multiple_choice_answering', 'task751_svamp_subtraction_question_answering', 'task752_svamp_multiplication_question_answering', 
'task753_svamp_addition_question_answering', 'task754_svamp_common-division_question_answering', 'task755_find_longest_substring_and_replace_its_sorted_lowercase_version_in_both_lists', 'task756_find_longert_substring_and_return_all_unique_alphabets_in_it', 'task761_app_review_classification', 'task766_craigslist_bargains_classification', 'task767_craigslist_bargains_classification', 'task770_pawsx_english_text_modification', 'task819_pec_sentiment_classification', 'task820_protoqa_answer_generation', 'task821_protoqa_question_generation', 'task823_peixian-rtgender_sentiment_analysis', 'task833_poem_sentiment_classification', 'task834_mathdataset_classification', 'task835_mathdataset_answer_generation', 'task843_financial_phrasebank_classification', 'task844_financial_phrasebank_classification', 'task845_pubmedqa_question_generation', 'task846_pubmedqa_classification', 'task847_pubmedqa_question_generation', 'task848_pubmedqa_classification', 'task849_pubmedqa_answer_generation', 'task850_synthetic_longest_palindrome', 'task851_synthetic_multiply_evens', 'task852_synthetic_multiply_odds', 'task853_hippocorpus_long_text_generation', 'task854_hippocorpus_classification', 'task855_conv_ai_2_classification', 'task856_conv_ai_2_classification', 'task857_inquisitive_question_generation', 'task858_inquisitive_span_detection', 'task859_prost_question_generation', 'task860_prost_mcq_generation', 'task861_asdiv_addsub_question_answering', 'task861_prost_mcq_answers_generation', 'task862_asdiv_multidiv_question_answering', 'task863_asdiv_multiop_question_answering', 'task864_asdiv_singleop_question_answering', 'task865_mawps_addsub_question_answering', 'task866_mawps_multidiv_question_answering', 'task867_mawps_multiop_question_answering', 'task868_cfq_mcd1_explanation_to_sql', 'task868_mawps_singleop_question_answering', 'task869_cfq_mcd1_sql_to_explanation', 'task870_msmarco_answer_generation', 'task871_msmarco_question_generation', 'task874_opus_xhosanavy_sr', 
'task875_emotion_classification', 'task886_quail_question_generation', 'task887_quail_answer_generation', 'task888_reviews_classification', 'task889_goemotions_classification', 'task897_freebase_qa_topic_question_generation', 'task898_freebase_qa_answer_generation', 'task899_freebase_qa_topic_generation', 'task900_freebase_qa_category_classification', 'task901_freebase_qa_category_question_generation', 'task902_deceptive_opinion_spam_classification', 'task903_deceptive_opinion_spam_classification', 'task904_hate_speech_offensive_classification', 'task905_hate_speech_offensive_classification', 'task906_dialogre_identify_names', 'task907_dialogre_identify_relationships', 'task908_dialogre_identify_familial_relationships', 'task909_dialogre_prevalent_speakers', 'task917_coqa_question_generation', 'task918_coqa_answer_generation', 'task919_coqa_incorrect_answer_generation', 'task921_code_x_glue_information_retreival', 'task922_event2mind_word_generation', 'task923_event2mind_classifier', 'task924_event2mind_word_generation', 'task925_coached_conv_pref_classifier', 'task926_coached_conv_pref_word_generation', 'task927_yelp_negative_to_positive_style_transfer', 'task928_yelp_positive_to_negative_style_transfer', 'task929_products_reviews_classification', 'task933_wiki_auto_style_transfer', 'task934_turk_simplification', 'task955_wiki_auto_style_transfer', 'task956_leetcode_420_strong_password_check', 'task963_librispeech_asr_next_word_prediction', 'task964_librispeech_asr_text_auto_completion', 'task965_librispeech_asr_missing_word_prediction', 'task966_ruletaker_fact_checking_based_on_given_context', 'task967_ruletaker_incorrect_fact_generation_based_on_given_paragraph']
```

Validation Tasks:
```
['task1333_check_validity_date_ddmmyyyy', 'task1403_check_validity_date_mmddyyyy', 'task291_semeval_2020_task4_commonsense_validation']
```

Test Tasks:
```
['task020_mctaco_span_based_question', 'task033_winogrande_answer_generation', 
'task034_winogrande_question_modification_object', 'task035_winogrande_question_modification_person', 'task036_qasc_topic_word_to_generate_related_fact', 'task039_qasc_find_overlapping_words', 'task050_multirc_answerability', 'task102_commongen_sentence_generation', 'task104_semeval_2019_task10_closed_vocabulary_mathematical_answer_generation', 'task1152_bard_analogical_reasoning_causation', 'task1153_bard_analogical_reasoning_affordance', 'task1154_bard_analogical_reasoning_travel', 'task1155_bard_analogical_reasoning_trash_or_treasure', 'task1156_bard_analogical_reasoning_tools', 'task1157_bard_analogical_reasoning_rooms_for_containers', 'task1158_bard_analogical_reasoning_manipulating_items', 'task1159_bard_analogical_reasoning_containers', 'task1161_coda19_title_generation', 'task118_semeval_2019_task10_open_vocabulary_mathematical_answer_generation', 'task1195_disflqa_disfluent_to_fluent_conversion', 'task119_semeval_2019_task10_geometric_mathematical_answer_generation', 'task121_zest_text_modification', 'task1336_peixian_equity_evaluation_corpus_gender_classifier', 'task1338_peixian_equity_evaluation_corpus_sentiment_classifier', 'task1339_peixian_equity_evaluation_corpus_text_completion', 'task133_winowhy_reason_plausibility_detection', 'task1342_amazon_us_reviews_title', 'task1344_glue_entailment_classification', 'task1345_glue_qqp_question_paraprashing', 'task1356_xlsum_title_generation', 'task1358_xlsum_title_generation', 'task1385_anli_r1_entailment', 'task1386_anli_r2_entailment', 'task1387_anli_r3_entailment', 'task1388_cb_entailment', 'task1390_wscfixed_coreference', 'task1391_winogrande_easy_answer_generation', 'task1393_superglue_copa_text_completion', 'task1394_meta_woz_task_classification', 'task1407_dart_question_generation', 'task1409_dart_text_generation', 'task1429_evalution_semantic_relation_classification', 'task1439_doqa_cooking_isanswerable', 'task1442_doqa_movies_isanswerable', 'task1509_evalution_antonyms', 
'task1510_evalution_relation_extraction', 'task1516_imppres_naturallanguageinference', 'task1529_scitail1.1_classification', 'task1531_daily_dialog_type_classification', 'task1533_daily_dialog_formal_classification', 'task1534_daily_dialog_question_classification', 'task1540_parsed_pdfs_summarization', 'task1554_scitail_classification', 'task1557_jfleg_answer_generation', 'task1562_zest_text_modification', 'task1584_evalution_meronym_classification', 'task1586_scifact_title_generation', 'task1598_nyc_long_text_generation', 'task1612_sick_label_classification', 'task1615_sick_tclassify_b_relation_a', 'task1622_disfl_qa_text_modication', 'task1624_disfl_qa_question_yesno_classification', 'task1631_openpi_answer_generation', 'task1640_aqa1.0_answerable_unanswerable_question_classification', 'task1659_title_generation', 'task1664_winobias_text_generation', 'task1728_web_nlg_data_to_text', 'task190_snli_classification', 'task199_mnli_classification', 'task200_mnli_entailment_classification', 'task201_mnli_neutral_classification', 'task202_mnli_contradiction_classification', 'task219_rocstories_title_answer_generation', 'task220_rocstories_title_classification', 'task226_english_language_answer_relevance_classification', 'task232_iirc_link_number_classification', 'task233_iirc_link_exists_classification', 'task242_tweetqa_classification', 'task249_enhanced_wsc_pronoun_disambiguation', 'task281_points_of_correspondence', 'task288_gigaword_summarization', 'task290_tellmewhy_question_answerability', 'task291_semeval_2020_task4_commonsense_validation', 'task295_semeval_2020_task4_commonsense_reasoning', 'task304_numeric_fused_head_resolution', 'task329_gap_classification', 'task330_gap_answer_generation', 'task333_hateeval_classification_hate_en', 'task335_hateeval_classification_aggresive_en', 'task337_hateeval_classification_individual_en', 'task349_squad2.0_answerable_unanswerable_question_classification', 'task362_spolin_yesand_prompt_response_sub_classification', 
'task386_semeval_2018_task3_irony_detection', 'task387_semeval_2018_task3_irony_classification', 'task391_causal_relationship', 'task392_inverse_causal_relationship', 'task393_plausible_result_generation', 'task397_semeval_2018_task1_tweet_anger_detection', 'task398_semeval_2018_task1_tweet_joy_detection', 'task399_semeval_2018_task1_tweet_sadness_detection', 'task401_numeric_fused_head_reference', 'task402_grailqa_paraphrase_generation', 'task418_persent_title_generation', 'task428_senteval_inversion', 'task429_senteval_tense', 'task430_senteval_subject_count', 'task431_senteval_object_count', 'task442_com_qa_paraphrase_question_generation', 'task495_semeval_headline_classification', 'task496_semeval_answer_generation', 'task500_scruples_anecdotes_title_generation', 'task510_reddit_tifu_title_summarization', 'task515_senteval_odd_word_out', 'task516_senteval_conjoints_inversion', 'task520_aquamuse_answer_given_in_passage', 'task569_recipe_nlg_text_generation', 'task602_wikitext-103_answer_generation', 'task613_politifact_text_generation', 'task614_glucose_cause_event_detection', 'task619_ohsumed_abstract_title_generation', 'task620_ohsumed_medical_subject_headings_answer_generation', 'task623_ohsumed_yes_no_answer_generation', 'task640_esnli_classification', 'task641_esnli_classification', 'task642_esnli_classification', 'task645_summarization', 'task648_answer_generation', 'task670_ambigqa_question_generation', 'task671_ambigqa_text_generation', 'task677_ollie_sentence_answer_generation', 'task738_perspectrum_classification', 'task743_eurlex_summarization', 'task760_msr_sqa_long_text_generation', 'task769_qed_summarization', 'task827_copa_commonsense_reasoning', 'task828_copa_commonsense_cause_effect', 'task879_schema_guided_dstc8_classification', 'task880_schema_guided_dstc8_classification', 'task890_gcwd_classification', 'task891_gap_coreference_resolution', 'task892_gap_reverse_coreference_resolution', 'task893_gap_fill_the_blank_coreference_resolution', 
'task909_dialogre_prevalent_speakers', 'task935_defeasible_nli_atomic_classification', 'task936_defeasible_nli_snli_classification', 'task937_defeasible_nli_social_classification', 'task957_e2e_nlg_text_generation_generate', 'task970_sherliic_causal_relationship']
```
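The three task lists above are not fully disjoint — for example, task291_semeval_2020_task4_commonsense_validation appears in the train, validation, and test lists — so it is worth checking for overlap before using them as splits. A minimal sketch of such a check, using small excerpts of the lists copied from above (in practice, bind the complete lists; the variable names here are illustrative):

```python
# Quick overlap check between the train/validation/test task splits.
# Only a few entries from each list above are reproduced here for brevity.
train_tasks = {
    "task291_semeval_2020_task4_commonsense_validation",
    "task875_emotion_classification",
    "task886_quail_question_generation",
}
validation_tasks = {
    "task1333_check_validity_date_ddmmyyyy",
    "task1403_check_validity_date_mmddyyyy",
    "task291_semeval_2020_task4_commonsense_validation",
}
test_tasks = {
    "task020_mctaco_span_based_question",
    "task291_semeval_2020_task4_commonsense_validation",
    "task970_sherliic_causal_relationship",
}

# Tasks shared by every split; non-empty output signals potential leakage.
in_all = train_tasks & validation_tasks & test_tasks
print(sorted(in_all))
```

Running the same intersection over the full lists flags every task that leaks across splits.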
autoevaluate/autoeval-staging-eval-project-d42d3c12-7815009
---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xtreme
eval_info:
  task: entity_extraction
  model: olpa/xlm-roberta-base-finetuned-panx-de
  metrics: []
  dataset_name: xtreme
  dataset_config: PAN-X.de
  dataset_split: test
  col_mapping:
    tokens: tokens
    tags: ner_tags
---

# Dataset Card for AutoTrain Evaluator

This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:

* Task: Token Classification
* Model: olpa/xlm-roberta-base-finetuned-panx-de
* Dataset: xtreme

To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).

## Contributions

Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model.
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_58
---
dataset_info:
  features:
  - name: logits
    sequence: float32
  - name: mfcc
    sequence:
      sequence: float64
  splits:
  - name: train
    num_bytes: 1299906128.0
    num_examples: 255284
  download_size: 1319571967
  dataset_size: 1299906128.0
---

# Dataset Card for "chunk_58"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
darkcleopas/jigsaw-toxic-comment-multi-binary
---
license: cc0-1.0
---
yezhengli9/wmt20-km-en
---
dataset_info:
  features:
  - name: id
    dtype: string
  - name: translation
    dtype: string
  splits:
  - name: train
    num_bytes: 1638851
    num_examples: 2320
  download_size: 626576
  dataset_size: 1638851
---

# Dataset Card for "wmt20-km-en"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
Narayana02/Accident-data
---
license: apache-2.0
task_categories:
- text2text-generation
---
open-llm-leaderboard/details_OpenBuddy__openbuddy-mixtral-7bx8-v16.3-32k
--- pretty_name: Evaluation run of OpenBuddy/openbuddy-mixtral-7bx8-v16.3-32k dataset_summary: "Dataset automatically created during the evaluation run of model\ \ [OpenBuddy/openbuddy-mixtral-7bx8-v16.3-32k](https://huggingface.co/OpenBuddy/openbuddy-mixtral-7bx8-v16.3-32k)\ \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\ \nThe dataset is composed of 63 configuration, each one coresponding to one of the\ \ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\ \ found as a specific split in each configuration, the split being named using the\ \ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\ \nAn additional configuration \"results\" store all the aggregated results of the\ \ run (and is used to compute and display the aggregated metrics on the [Open LLM\ \ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\ \nTo load the details from a run, you can for instance do the following:\n```python\n\ from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddy__openbuddy-mixtral-7bx8-v16.3-32k\"\ ,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\ These are the [latest results from run 2024-01-05T04:57:41.818907](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-mixtral-7bx8-v16.3-32k/blob/main/results_2024-01-05T04-57-41.818907.json)(note\ \ that their might be results for other tasks in the repos if successive evals didn't\ \ cover the same tasks. 
You find each in the results and the \"latest\" split for\ \ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6992764990088943,\n\ \ \"acc_stderr\": 0.030080190218914955,\n \"acc_norm\": 0.7136773526591422,\n\ \ \"acc_norm_stderr\": 0.030895052800428254,\n \"mc1\": 0.3953488372093023,\n\ \ \"mc1_stderr\": 0.017115815632418187,\n \"mc2\": 0.5638733199382533,\n\ \ \"mc2_stderr\": 0.014806158821537194\n },\n \"harness|arc:challenge|25\"\ : {\n \"acc\": 0.22781569965870307,\n \"acc_stderr\": 0.01225670860232692,\n\ \ \"acc_norm\": 0.2645051194539249,\n \"acc_norm_stderr\": 0.012889272949313368\n\ \ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6227843059151563,\n\ \ \"acc_stderr\": 0.004836990373261572,\n \"acc_norm\": 0.8083051185022904,\n\ \ \"acc_norm_stderr\": 0.003928298121755031\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\ : {\n \"acc\": 0.48,\n \"acc_stderr\": 0.05021167315686779,\n \ \ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.05021167315686779\n \ \ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6814814814814815,\n\ \ \"acc_stderr\": 0.040247784019771096,\n \"acc_norm\": 0.6814814814814815,\n\ \ \"acc_norm_stderr\": 0.040247784019771096\n },\n \"harness|hendrycksTest-astronomy|5\"\ : {\n \"acc\": 0.8026315789473685,\n \"acc_stderr\": 0.03238981601699397,\n\ \ \"acc_norm\": 0.8026315789473685,\n \"acc_norm_stderr\": 0.03238981601699397\n\ \ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.7,\n\ \ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \ \ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\ : {\n \"acc\": 0.8037735849056604,\n \"acc_stderr\": 0.024442388131100827,\n\ \ \"acc_norm\": 0.8037735849056604,\n \"acc_norm_stderr\": 0.024442388131100827\n\ \ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8333333333333334,\n\ \ \"acc_stderr\": 0.031164899666948617,\n \"acc_norm\": 0.8333333333333334,\n\ \ \"acc_norm_stderr\": 
0.031164899666948617\n },\n \"harness|hendrycksTest-college_chemistry|5\"\ : {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \ \ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \ \ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\ : 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n\ \ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_mathematics|5\"\ : {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \ \ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \ \ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6994219653179191,\n\ \ \"acc_stderr\": 0.03496101481191179,\n \"acc_norm\": 0.6994219653179191,\n\ \ \"acc_norm_stderr\": 0.03496101481191179\n },\n \"harness|hendrycksTest-college_physics|5\"\ : {\n \"acc\": 0.5392156862745098,\n \"acc_stderr\": 0.049598599663841815,\n\ \ \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.049598599663841815\n\ \ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\ \ 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n\ \ \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\ : {\n \"acc\": 0.6936170212765957,\n \"acc_stderr\": 0.03013590647851756,\n\ \ \"acc_norm\": 0.6936170212765957,\n \"acc_norm_stderr\": 0.03013590647851756\n\ \ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6140350877192983,\n\ \ \"acc_stderr\": 0.045796394220704355,\n \"acc_norm\": 0.6140350877192983,\n\ \ \"acc_norm_stderr\": 0.045796394220704355\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\ : {\n \"acc\": 0.7103448275862069,\n \"acc_stderr\": 0.03780019230438015,\n\ \ \"acc_norm\": 0.7103448275862069,\n \"acc_norm_stderr\": 0.03780019230438015\n\ \ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\ : 0.5052910052910053,\n \"acc_stderr\": 0.02574986828855657,\n \"\ acc_norm\": 
0.5052910052910053,\n \"acc_norm_stderr\": 0.02574986828855657\n\ \ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5793650793650794,\n\ \ \"acc_stderr\": 0.04415438226743745,\n \"acc_norm\": 0.5793650793650794,\n\ \ \"acc_norm_stderr\": 0.04415438226743745\n },\n \"harness|hendrycksTest-global_facts|5\"\ : {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \ \ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \ \ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\ : 0.8516129032258064,\n \"acc_stderr\": 0.020222737554330378,\n \"\ acc_norm\": 0.8516129032258064,\n \"acc_norm_stderr\": 0.020222737554330378\n\ \ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\ : 0.5862068965517241,\n \"acc_stderr\": 0.03465304488406796,\n \"\ acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.03465304488406796\n\ \ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \ \ \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\"\ : 0.73,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\ : {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.028450388805284332,\n\ \ \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.028450388805284332\n\ \ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\ : 0.8535353535353535,\n \"acc_stderr\": 0.025190921114603918,\n \"\ acc_norm\": 0.8535353535353535,\n \"acc_norm_stderr\": 0.025190921114603918\n\ \ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\ \ \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.01673108529360755,\n\ \ \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.01673108529360755\n\ \ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \ \ \"acc\": 0.7076923076923077,\n \"acc_stderr\": 0.023060438380857733,\n\ \ \"acc_norm\": 0.7076923076923077,\n \"acc_norm_stderr\": 0.023060438380857733\n\ \ },\n 
\"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\ acc\": 0.3888888888888889,\n \"acc_stderr\": 0.029723278961476664,\n \ \ \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.029723278961476664\n\ \ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \ \ \"acc\": 0.8151260504201681,\n \"acc_stderr\": 0.025215992877954202,\n\ \ \"acc_norm\": 0.8151260504201681,\n \"acc_norm_stderr\": 0.025215992877954202\n\ \ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\ : 0.45695364238410596,\n \"acc_stderr\": 0.04067325174247443,\n \"\ acc_norm\": 0.45695364238410596,\n \"acc_norm_stderr\": 0.04067325174247443\n\ \ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\ : 0.8880733944954129,\n \"acc_stderr\": 0.013517352714958786,\n \"\ acc_norm\": 0.8880733944954129,\n \"acc_norm_stderr\": 0.013517352714958786\n\ \ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\ : 0.5879629629629629,\n \"acc_stderr\": 0.03356787758160831,\n \"\ acc_norm\": 0.5879629629629629,\n \"acc_norm_stderr\": 0.03356787758160831\n\ \ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\ : 0.8725490196078431,\n \"acc_stderr\": 0.02340553048084631,\n \"\ acc_norm\": 0.8725490196078431,\n \"acc_norm_stderr\": 0.02340553048084631\n\ \ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\ acc\": 0.8987341772151899,\n \"acc_stderr\": 0.019637720526065498,\n \ \ \"acc_norm\": 0.8987341772151899,\n \"acc_norm_stderr\": 0.019637720526065498\n\ \ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7443946188340808,\n\ \ \"acc_stderr\": 0.029275891003969923,\n \"acc_norm\": 0.7443946188340808,\n\ \ \"acc_norm_stderr\": 0.029275891003969923\n },\n \"harness|hendrycksTest-human_sexuality|5\"\ : {\n \"acc\": 0.8244274809160306,\n \"acc_stderr\": 0.03336820338476076,\n\ \ \"acc_norm\": 0.8244274809160306,\n \"acc_norm_stderr\": 0.03336820338476076\n\ \ },\n 
\"harness|hendrycksTest-international_law|5\": {\n \"acc\":\ \ 0.859504132231405,\n \"acc_stderr\": 0.03172233426002158,\n \"acc_norm\"\ : 0.859504132231405,\n \"acc_norm_stderr\": 0.03172233426002158\n },\n\ \ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n\ \ \"acc_stderr\": 0.03520703990517963,\n \"acc_norm\": 0.8425925925925926,\n\ \ \"acc_norm_stderr\": 0.03520703990517963\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\ : {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742179,\n\ \ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742179\n\ \ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6071428571428571,\n\ \ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.6071428571428571,\n\ \ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\ : {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.0349260647662379,\n\ \ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.0349260647662379\n\ \ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\ \ \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n\ \ \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\ : {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \ \ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \ \ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8786717752234994,\n\ \ \"acc_stderr\": 0.01167591388390672,\n \"acc_norm\": 0.8786717752234994,\n\ \ \"acc_norm_stderr\": 0.01167591388390672\n },\n \"harness|hendrycksTest-moral_disputes|5\"\ : {\n \"acc\": 0.791907514450867,\n \"acc_stderr\": 0.021855255263421795,\n\ \ \"acc_norm\": 0.791907514450867,\n \"acc_norm_stderr\": 0.021855255263421795\n\ \ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.48156424581005586,\n\ \ \"acc_stderr\": 0.016711130497782816,\n \"acc_norm\": 
0.48156424581005586,\n\ \ \"acc_norm_stderr\": 0.016711130497782816\n },\n \"harness|hendrycksTest-nutrition|5\"\ : {\n \"acc\": 0.7810457516339869,\n \"acc_stderr\": 0.02367908986180772,\n\ \ \"acc_norm\": 0.7810457516339869,\n \"acc_norm_stderr\": 0.02367908986180772\n\ \ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8006430868167203,\n\ \ \"acc_stderr\": 0.022691033780549656,\n \"acc_norm\": 0.8006430868167203,\n\ \ \"acc_norm_stderr\": 0.022691033780549656\n },\n \"harness|hendrycksTest-prehistory|5\"\ : {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.021613809395224805,\n\ \ \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.021613809395224805\n\ \ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\ acc\": 0.5531914893617021,\n \"acc_stderr\": 0.029658235097666907,\n \ \ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.029658235097666907\n\ \ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5208604954367666,\n\ \ \"acc_stderr\": 0.01275911706651801,\n \"acc_norm\": 0.5208604954367666,\n\ \ \"acc_norm_stderr\": 0.01275911706651801\n },\n \"harness|hendrycksTest-professional_medicine|5\"\ : {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.02456220431414231,\n\ \ \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.02456220431414231\n\ \ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\ acc\": 0.7549019607843137,\n \"acc_stderr\": 0.017401816711427657,\n \ \ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.017401816711427657\n\ \ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\ \ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \ \ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\ : {\n \"acc\": 0.7877551020408163,\n \"acc_stderr\": 0.026176967197866767,\n\ \ \"acc_norm\": 0.7877551020408163,\n \"acc_norm_stderr\": 0.026176967197866767\n\ \ },\n \"harness|hendrycksTest-sociology|5\": 
{\n \"acc\": 0.8656716417910447,\n\ \ \"acc_stderr\": 0.024112678240900798,\n \"acc_norm\": 0.8656716417910447,\n\ \ \"acc_norm_stderr\": 0.024112678240900798\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\ : {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776348,\n \ \ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776348\n \ \ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\ \ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\ \ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\ : {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n\ \ \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n\ \ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3953488372093023,\n\ \ \"mc1_stderr\": 0.017115815632418187,\n \"mc2\": 0.5638733199382533,\n\ \ \"mc2_stderr\": 0.014806158821537194\n },\n \"harness|winogrande|5\"\ : {\n \"acc\": 0.771112865035517,\n \"acc_stderr\": 0.011807360224025397\n\ \ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\ : 0.0\n }\n}\n```" repo_url: https://huggingface.co/OpenBuddy/openbuddy-mixtral-7bx8-v16.3-32k leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard point_of_contact: clementine@hf.co configs: - config_name: harness_arc_challenge_25 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|arc:challenge|25_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|arc:challenge|25_2024-01-05T04-57-41.818907.parquet' - config_name: harness_gsm8k_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|gsm8k|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|gsm8k|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hellaswag_10 data_files: - split: 2024_01_05T04_57_41.818907 path: - 
'**/details_harness|hellaswag|10_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hellaswag|10_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T04-57-41.818907.parquet' - 
'**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-05T04-57-41.818907.parquet' 
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T04-57-41.818907.parquet' - 
'**/details_harness|hendrycksTest-college_biology|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T04-57-41.818907.parquet' - 
'**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-international_law|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-management|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-marketing|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T04-57-41.818907.parquet' - 
'**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-sociology|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-virology|5_2024-01-05T04-57-41.818907.parquet' - '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_abstract_algebra_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_anatomy_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-anatomy|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_astronomy_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-astronomy|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_business_ethics_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - 
'**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-business_ethics|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_clinical_knowledge_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_college_biology_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_biology|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_college_chemistry_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_college_computer_science_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_college_mathematics_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_college_medicine_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - 
'**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_medicine|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_college_physics_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-college_physics|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_computer_security_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-computer_security|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_conceptual_physics_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_econometrics_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-econometrics|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_electrical_engineering_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_elementary_mathematics_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T04-57-41.818907.parquet' - 
split: latest path: - '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_formal_logic_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-formal_logic|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_global_facts_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-global_facts|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_high_school_biology_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_high_school_chemistry_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_high_school_computer_science_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_high_school_european_history_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - 
'**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_high_school_geography_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_high_school_government_and_politics_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_high_school_macroeconomics_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_high_school_mathematics_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_high_school_microeconomics_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_high_school_physics_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - 
'**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_high_school_psychology_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_high_school_statistics_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_high_school_us_history_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_high_school_world_history_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_human_aging_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_aging|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_human_sexuality_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - 
'**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_international_law_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-international_law|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_jurisprudence_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_logical_fallacies_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_machine_learning_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-machine_learning|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_management_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-management|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - '**/details_harness|hendrycksTest-management|5_2024-01-05T04-57-41.818907.parquet' - config_name: harness_hendrycksTest_marketing_5 data_files: - split: 2024_01_05T04_57_41.818907 path: - '**/details_harness|hendrycksTest-marketing|5_2024-01-05T04-57-41.818907.parquet' - split: latest path: - 
    '**/details_harness|hendrycksTest-marketing|5_2024-01-05T04-57-41.818907.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
  data_files:
  - split: 2024_01_05T04_57_41.818907
    path:
    - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T04-57-41.818907.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-05T04-57-41.818907.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
  data_files:
  - split: 2024_01_05T04_57_41.818907
    path:
    - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T04-57-41.818907.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-05T04-57-41.818907.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
  data_files:
  - split: 2024_01_05T04_57_41.818907
    path:
    - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T04-57-41.818907.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-05T04-57-41.818907.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
  data_files:
  - split: 2024_01_05T04_57_41.818907
    path:
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T04-57-41.818907.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-05T04-57-41.818907.parquet'
- config_name: harness_hendrycksTest_nutrition_5
  data_files:
  - split: 2024_01_05T04_57_41.818907
    path:
    - '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T04-57-41.818907.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-nutrition|5_2024-01-05T04-57-41.818907.parquet'
- config_name: harness_hendrycksTest_philosophy_5
  data_files:
  - split: 2024_01_05T04_57_41.818907
    path:
    - '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T04-57-41.818907.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-philosophy|5_2024-01-05T04-57-41.818907.parquet'
- config_name: harness_hendrycksTest_prehistory_5
  data_files:
  - split: 2024_01_05T04_57_41.818907
    path:
    - '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T04-57-41.818907.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-prehistory|5_2024-01-05T04-57-41.818907.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
  data_files:
  - split: 2024_01_05T04_57_41.818907
    path:
    - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T04-57-41.818907.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-05T04-57-41.818907.parquet'
- config_name: harness_hendrycksTest_professional_law_5
  data_files:
  - split: 2024_01_05T04_57_41.818907
    path:
    - '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T04-57-41.818907.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-professional_law|5_2024-01-05T04-57-41.818907.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
  data_files:
  - split: 2024_01_05T04_57_41.818907
    path:
    - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T04-57-41.818907.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-05T04-57-41.818907.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
  data_files:
  - split: 2024_01_05T04_57_41.818907
    path:
    - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T04-57-41.818907.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-05T04-57-41.818907.parquet'
- config_name: harness_hendrycksTest_public_relations_5
  data_files:
  - split: 2024_01_05T04_57_41.818907
    path:
    - '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T04-57-41.818907.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-public_relations|5_2024-01-05T04-57-41.818907.parquet'
- config_name: harness_hendrycksTest_security_studies_5
  data_files:
  - split: 2024_01_05T04_57_41.818907
    path:
    - '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T04-57-41.818907.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-security_studies|5_2024-01-05T04-57-41.818907.parquet'
- config_name: harness_hendrycksTest_sociology_5
  data_files:
  - split: 2024_01_05T04_57_41.818907
    path:
    - '**/details_harness|hendrycksTest-sociology|5_2024-01-05T04-57-41.818907.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-sociology|5_2024-01-05T04-57-41.818907.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
  data_files:
  - split: 2024_01_05T04_57_41.818907
    path:
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T04-57-41.818907.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-05T04-57-41.818907.parquet'
- config_name: harness_hendrycksTest_virology_5
  data_files:
  - split: 2024_01_05T04_57_41.818907
    path:
    - '**/details_harness|hendrycksTest-virology|5_2024-01-05T04-57-41.818907.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-virology|5_2024-01-05T04-57-41.818907.parquet'
- config_name: harness_hendrycksTest_world_religions_5
  data_files:
  - split: 2024_01_05T04_57_41.818907
    path:
    - '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T04-57-41.818907.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-world_religions|5_2024-01-05T04-57-41.818907.parquet'
- config_name: harness_truthfulqa_mc_0
  data_files:
  - split: 2024_01_05T04_57_41.818907
    path:
    - '**/details_harness|truthfulqa:mc|0_2024-01-05T04-57-41.818907.parquet'
  - split: latest
    path:
    - '**/details_harness|truthfulqa:mc|0_2024-01-05T04-57-41.818907.parquet'
- config_name: harness_winogrande_5
  data_files:
  - split: 2024_01_05T04_57_41.818907
    path:
    - '**/details_harness|winogrande|5_2024-01-05T04-57-41.818907.parquet'
  - split: latest
    path:
    - '**/details_harness|winogrande|5_2024-01-05T04-57-41.818907.parquet'
- config_name: results
  data_files:
  - split: 2024_01_05T04_57_41.818907
    path:
    - results_2024-01-05T04-57-41.818907.parquet
  - split: latest
    path:
    - results_2024-01-05T04-57-41.818907.parquet
---

# Dataset Card for Evaluation run of OpenBuddy/openbuddy-mixtral-7bx8-v16.3-32k

<!-- Provide a quick summary of the dataset. -->

Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-mixtral-7bx8-v16.3-32k](https://huggingface.co/OpenBuddy/openbuddy-mixtral-7bx8-v16.3-32k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

To load the details from a run, you can for instance do the following:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-mixtral-7bx8-v16.3-32k",
	"harness_winogrande_5",
	split="train")
```

## Latest results

These are the [latest results from run 2024-01-05T04:57:41.818907](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-mixtral-7bx8-v16.3-32k/blob/main/results_2024-01-05T04-57-41.818907.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks.
You find each in the results and the "latest" split for each eval): ```python { "all": { "acc": 0.6992764990088943, "acc_stderr": 0.030080190218914955, "acc_norm": 0.7136773526591422, "acc_norm_stderr": 0.030895052800428254, "mc1": 0.3953488372093023, "mc1_stderr": 0.017115815632418187, "mc2": 0.5638733199382533, "mc2_stderr": 0.014806158821537194 }, "harness|arc:challenge|25": { "acc": 0.22781569965870307, "acc_stderr": 0.01225670860232692, "acc_norm": 0.2645051194539249, "acc_norm_stderr": 0.012889272949313368 }, "harness|hellaswag|10": { "acc": 0.6227843059151563, "acc_stderr": 0.004836990373261572, "acc_norm": 0.8083051185022904, "acc_norm_stderr": 0.003928298121755031 }, "harness|hendrycksTest-abstract_algebra|5": { "acc": 0.48, "acc_stderr": 0.05021167315686779, "acc_norm": 0.48, "acc_norm_stderr": 0.05021167315686779 }, "harness|hendrycksTest-anatomy|5": { "acc": 0.6814814814814815, "acc_stderr": 0.040247784019771096, "acc_norm": 0.6814814814814815, "acc_norm_stderr": 0.040247784019771096 }, "harness|hendrycksTest-astronomy|5": { "acc": 0.8026315789473685, "acc_stderr": 0.03238981601699397, "acc_norm": 0.8026315789473685, "acc_norm_stderr": 0.03238981601699397 }, "harness|hendrycksTest-business_ethics|5": { "acc": 0.7, "acc_stderr": 0.046056618647183814, "acc_norm": 0.7, "acc_norm_stderr": 0.046056618647183814 }, "harness|hendrycksTest-clinical_knowledge|5": { "acc": 0.8037735849056604, "acc_stderr": 0.024442388131100827, "acc_norm": 0.8037735849056604, "acc_norm_stderr": 0.024442388131100827 }, "harness|hendrycksTest-college_biology|5": { "acc": 0.8333333333333334, "acc_stderr": 0.031164899666948617, "acc_norm": 0.8333333333333334, "acc_norm_stderr": 0.031164899666948617 }, "harness|hendrycksTest-college_chemistry|5": { "acc": 0.56, "acc_stderr": 0.04988876515698589, "acc_norm": 0.56, "acc_norm_stderr": 0.04988876515698589 }, "harness|hendrycksTest-college_computer_science|5": { "acc": 0.68, "acc_stderr": 0.046882617226215034, "acc_norm": 0.68, 
"acc_norm_stderr": 0.046882617226215034 }, "harness|hendrycksTest-college_mathematics|5": { "acc": 0.46, "acc_stderr": 0.05009082659620333, "acc_norm": 0.46, "acc_norm_stderr": 0.05009082659620333 }, "harness|hendrycksTest-college_medicine|5": { "acc": 0.6994219653179191, "acc_stderr": 0.03496101481191179, "acc_norm": 0.6994219653179191, "acc_norm_stderr": 0.03496101481191179 }, "harness|hendrycksTest-college_physics|5": { "acc": 0.5392156862745098, "acc_stderr": 0.049598599663841815, "acc_norm": 0.5392156862745098, "acc_norm_stderr": 0.049598599663841815 }, "harness|hendrycksTest-computer_security|5": { "acc": 0.8, "acc_stderr": 0.04020151261036846, "acc_norm": 0.8, "acc_norm_stderr": 0.04020151261036846 }, "harness|hendrycksTest-conceptual_physics|5": { "acc": 0.6936170212765957, "acc_stderr": 0.03013590647851756, "acc_norm": 0.6936170212765957, "acc_norm_stderr": 0.03013590647851756 }, "harness|hendrycksTest-econometrics|5": { "acc": 0.6140350877192983, "acc_stderr": 0.045796394220704355, "acc_norm": 0.6140350877192983, "acc_norm_stderr": 0.045796394220704355 }, "harness|hendrycksTest-electrical_engineering|5": { "acc": 0.7103448275862069, "acc_stderr": 0.03780019230438015, "acc_norm": 0.7103448275862069, "acc_norm_stderr": 0.03780019230438015 }, "harness|hendrycksTest-elementary_mathematics|5": { "acc": 0.5052910052910053, "acc_stderr": 0.02574986828855657, "acc_norm": 0.5052910052910053, "acc_norm_stderr": 0.02574986828855657 }, "harness|hendrycksTest-formal_logic|5": { "acc": 0.5793650793650794, "acc_stderr": 0.04415438226743745, "acc_norm": 0.5793650793650794, "acc_norm_stderr": 0.04415438226743745 }, "harness|hendrycksTest-global_facts|5": { "acc": 0.48, "acc_stderr": 0.050211673156867795, "acc_norm": 0.48, "acc_norm_stderr": 0.050211673156867795 }, "harness|hendrycksTest-high_school_biology|5": { "acc": 0.8516129032258064, "acc_stderr": 0.020222737554330378, "acc_norm": 0.8516129032258064, "acc_norm_stderr": 0.020222737554330378 }, 
"harness|hendrycksTest-high_school_chemistry|5": { "acc": 0.5862068965517241, "acc_stderr": 0.03465304488406796, "acc_norm": 0.5862068965517241, "acc_norm_stderr": 0.03465304488406796 }, "harness|hendrycksTest-high_school_computer_science|5": { "acc": 0.73, "acc_stderr": 0.04461960433384739, "acc_norm": 0.73, "acc_norm_stderr": 0.04461960433384739 }, "harness|hendrycksTest-high_school_european_history|5": { "acc": 0.8424242424242424, "acc_stderr": 0.028450388805284332, "acc_norm": 0.8424242424242424, "acc_norm_stderr": 0.028450388805284332 }, "harness|hendrycksTest-high_school_geography|5": { "acc": 0.8535353535353535, "acc_stderr": 0.025190921114603918, "acc_norm": 0.8535353535353535, "acc_norm_stderr": 0.025190921114603918 }, "harness|hendrycksTest-high_school_government_and_politics|5": { "acc": 0.9430051813471503, "acc_stderr": 0.01673108529360755, "acc_norm": 0.9430051813471503, "acc_norm_stderr": 0.01673108529360755 }, "harness|hendrycksTest-high_school_macroeconomics|5": { "acc": 0.7076923076923077, "acc_stderr": 0.023060438380857733, "acc_norm": 0.7076923076923077, "acc_norm_stderr": 0.023060438380857733 }, "harness|hendrycksTest-high_school_mathematics|5": { "acc": 0.3888888888888889, "acc_stderr": 0.029723278961476664, "acc_norm": 0.3888888888888889, "acc_norm_stderr": 0.029723278961476664 }, "harness|hendrycksTest-high_school_microeconomics|5": { "acc": 0.8151260504201681, "acc_stderr": 0.025215992877954202, "acc_norm": 0.8151260504201681, "acc_norm_stderr": 0.025215992877954202 }, "harness|hendrycksTest-high_school_physics|5": { "acc": 0.45695364238410596, "acc_stderr": 0.04067325174247443, "acc_norm": 0.45695364238410596, "acc_norm_stderr": 0.04067325174247443 }, "harness|hendrycksTest-high_school_psychology|5": { "acc": 0.8880733944954129, "acc_stderr": 0.013517352714958786, "acc_norm": 0.8880733944954129, "acc_norm_stderr": 0.013517352714958786 }, "harness|hendrycksTest-high_school_statistics|5": { "acc": 0.5879629629629629, "acc_stderr": 
0.03356787758160831, "acc_norm": 0.5879629629629629, "acc_norm_stderr": 0.03356787758160831 }, "harness|hendrycksTest-high_school_us_history|5": { "acc": 0.8725490196078431, "acc_stderr": 0.02340553048084631, "acc_norm": 0.8725490196078431, "acc_norm_stderr": 0.02340553048084631 }, "harness|hendrycksTest-high_school_world_history|5": { "acc": 0.8987341772151899, "acc_stderr": 0.019637720526065498, "acc_norm": 0.8987341772151899, "acc_norm_stderr": 0.019637720526065498 }, "harness|hendrycksTest-human_aging|5": { "acc": 0.7443946188340808, "acc_stderr": 0.029275891003969923, "acc_norm": 0.7443946188340808, "acc_norm_stderr": 0.029275891003969923 }, "harness|hendrycksTest-human_sexuality|5": { "acc": 0.8244274809160306, "acc_stderr": 0.03336820338476076, "acc_norm": 0.8244274809160306, "acc_norm_stderr": 0.03336820338476076 }, "harness|hendrycksTest-international_law|5": { "acc": 0.859504132231405, "acc_stderr": 0.03172233426002158, "acc_norm": 0.859504132231405, "acc_norm_stderr": 0.03172233426002158 }, "harness|hendrycksTest-jurisprudence|5": { "acc": 0.8425925925925926, "acc_stderr": 0.03520703990517963, "acc_norm": 0.8425925925925926, "acc_norm_stderr": 0.03520703990517963 }, "harness|hendrycksTest-logical_fallacies|5": { "acc": 0.7791411042944786, "acc_stderr": 0.03259177392742179, "acc_norm": 0.7791411042944786, "acc_norm_stderr": 0.03259177392742179 }, "harness|hendrycksTest-machine_learning|5": { "acc": 0.6071428571428571, "acc_stderr": 0.04635550135609976, "acc_norm": 0.6071428571428571, "acc_norm_stderr": 0.04635550135609976 }, "harness|hendrycksTest-management|5": { "acc": 0.8543689320388349, "acc_stderr": 0.0349260647662379, "acc_norm": 0.8543689320388349, "acc_norm_stderr": 0.0349260647662379 }, "harness|hendrycksTest-marketing|5": { "acc": 0.8846153846153846, "acc_stderr": 0.020930193185179333, "acc_norm": 0.8846153846153846, "acc_norm_stderr": 0.020930193185179333 }, "harness|hendrycksTest-medical_genetics|5": { "acc": 0.79, "acc_stderr": 
0.040936018074033256, "acc_norm": 0.79, "acc_norm_stderr": 0.040936018074033256 }, "harness|hendrycksTest-miscellaneous|5": { "acc": 0.8786717752234994, "acc_stderr": 0.01167591388390672, "acc_norm": 0.8786717752234994, "acc_norm_stderr": 0.01167591388390672 }, "harness|hendrycksTest-moral_disputes|5": { "acc": 0.791907514450867, "acc_stderr": 0.021855255263421795, "acc_norm": 0.791907514450867, "acc_norm_stderr": 0.021855255263421795 }, "harness|hendrycksTest-moral_scenarios|5": { "acc": 0.48156424581005586, "acc_stderr": 0.016711130497782816, "acc_norm": 0.48156424581005586, "acc_norm_stderr": 0.016711130497782816 }, "harness|hendrycksTest-nutrition|5": { "acc": 0.7810457516339869, "acc_stderr": 0.02367908986180772, "acc_norm": 0.7810457516339869, "acc_norm_stderr": 0.02367908986180772 }, "harness|hendrycksTest-philosophy|5": { "acc": 0.8006430868167203, "acc_stderr": 0.022691033780549656, "acc_norm": 0.8006430868167203, "acc_norm_stderr": 0.022691033780549656 }, "harness|hendrycksTest-prehistory|5": { "acc": 0.8148148148148148, "acc_stderr": 0.021613809395224805, "acc_norm": 0.8148148148148148, "acc_norm_stderr": 0.021613809395224805 }, "harness|hendrycksTest-professional_accounting|5": { "acc": 0.5531914893617021, "acc_stderr": 0.029658235097666907, "acc_norm": 0.5531914893617021, "acc_norm_stderr": 0.029658235097666907 }, "harness|hendrycksTest-professional_law|5": { "acc": 0.5208604954367666, "acc_stderr": 0.01275911706651801, "acc_norm": 0.5208604954367666, "acc_norm_stderr": 0.01275911706651801 }, "harness|hendrycksTest-professional_medicine|5": { "acc": 0.7941176470588235, "acc_stderr": 0.02456220431414231, "acc_norm": 0.7941176470588235, "acc_norm_stderr": 0.02456220431414231 }, "harness|hendrycksTest-professional_psychology|5": { "acc": 0.7549019607843137, "acc_stderr": 0.017401816711427657, "acc_norm": 0.7549019607843137, "acc_norm_stderr": 0.017401816711427657 }, "harness|hendrycksTest-public_relations|5": { "acc": 0.7, "acc_stderr": 
0.04389311454644287, "acc_norm": 0.7, "acc_norm_stderr": 0.04389311454644287 }, "harness|hendrycksTest-security_studies|5": { "acc": 0.7877551020408163, "acc_stderr": 0.026176967197866767, "acc_norm": 0.7877551020408163, "acc_norm_stderr": 0.026176967197866767 }, "harness|hendrycksTest-sociology|5": { "acc": 0.8656716417910447, "acc_stderr": 0.024112678240900798, "acc_norm": 0.8656716417910447, "acc_norm_stderr": 0.024112678240900798 }, "harness|hendrycksTest-us_foreign_policy|5": { "acc": 0.9, "acc_stderr": 0.030151134457776348, "acc_norm": 0.9, "acc_norm_stderr": 0.030151134457776348 }, "harness|hendrycksTest-virology|5": { "acc": 0.5120481927710844, "acc_stderr": 0.03891364495835817, "acc_norm": 0.5120481927710844, "acc_norm_stderr": 0.03891364495835817 }, "harness|hendrycksTest-world_religions|5": { "acc": 0.8654970760233918, "acc_stderr": 0.026168221344662297, "acc_norm": 0.8654970760233918, "acc_norm_stderr": 0.026168221344662297 }, "harness|truthfulqa:mc|0": { "mc1": 0.3953488372093023, "mc1_stderr": 0.017115815632418187, "mc2": 0.5638733199382533, "mc2_stderr": 0.014806158821537194 }, "harness|winogrande|5": { "acc": 0.771112865035517, "acc_stderr": 0.011807360224025397 }, "harness|gsm8k|5": { "acc": 0.0, "acc_stderr": 0.0 } } ``` ## Dataset Details ### Dataset Description <!-- Provide a longer summary of what this dataset is. --> - **Curated by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] ### Dataset Sources [optional] <!-- Provide the basic links for the dataset. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the dataset is intended to be used. --> ### Direct Use <!-- This section describes suitable use cases for the dataset. 
-->

[More Information Needed]

### Out-of-Scope Use

<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->

[More Information Needed]

## Dataset Structure

<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->

[More Information Needed]

## Dataset Creation

### Curation Rationale

<!-- Motivation for the creation of this dataset. -->

[More Information Needed]

### Source Data

<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->

#### Data Collection and Processing

<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->

[More Information Needed]

#### Who are the source data producers?

<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->

[More Information Needed]

### Annotations [optional]

<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->

#### Annotation process

<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->

[More Information Needed]

#### Who are the annotators?

<!-- This section describes the people or systems who created the annotations. -->

[More Information Needed]

#### Personal and Sensitive Information

<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->

[More Information Needed]

## Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->

[More Information Needed]

### Recommendations

<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->

Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.

## Citation [optional]

<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->

**BibTeX:**

[More Information Needed]

**APA:**

[More Information Needed]

## Glossary [optional]

<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->

[More Information Needed]

## More Information [optional]

[More Information Needed]

## Dataset Card Authors [optional]

[More Information Needed]

## Dataset Card Contact

[More Information Needed]
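As a sanity check on the numbers reported above, each per-task `acc_stderr` appears consistent with the standard error of a sample proportion with Bessel's correction, sqrt(acc * (1 - acc) / (n - 1)). A minimal sketch (assuming the `abstract_algebra` test split has its usual 100 questions; the harness may compute uncertainty differently for other metrics such as `mc2`):

```python
import math

def proportion_stderr(acc, n):
    # standard error of a sample proportion, with Bessel's correction (n - 1)
    return math.sqrt(acc * (1 - acc) / (n - 1))

# hendrycksTest-abstract_algebra above reports acc = 0.48 with
# acc_stderr ~= 0.0502; the MMLU abstract_algebra test split has 100 questions.
print(proportion_stderr(0.48, 100))  # ~= 0.050212, matching the reported acc_stderr
```

This also explains why small tasks (n = 100) carry much larger standard errors than large ones such as `professional_law`.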
CyberHarem/sophia_granbluefantasy
---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---

# Dataset of sophia/ソフィア (Granblue Fantasy)

This is the dataset of sophia/ソフィア (Granblue Fantasy), containing 104 images and their tags. The core tags of this character are `blue_hair, twintails, hat, breasts, long_hair, large_breasts, yellow_eyes, brown_eyes`, which are pruned in this dataset.

Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

## List of Packages

| Name             | Images | Size       | Download                                                                                                                | Type       | Description                                                          |
|:-----------------|-------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw              | 104    | 107.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sophia_granbluefantasy/resolve/main/dataset-raw.zip)              | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800              | 104    | 74.23 MiB  | [Download](https://huggingface.co/datasets/CyberHarem/sophia_granbluefantasy/resolve/main/dataset-800.zip)              | IMG+TXT    | dataset with the shorter side not exceeding 800 pixels.              |
| stage3-p480-800  | 223    | 140.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sophia_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip)  | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.  |
| 1200             | 104    | 99.56 MiB  | [Download](https://huggingface.co/datasets/CyberHarem/sophia_granbluefantasy/resolve/main/dataset-1200.zip)             | IMG+TXT    | dataset with the shorter side not exceeding 1200 pixels.             |
| stage3-p480-1200 | 223    | 179.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/sophia_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT    | 3-stage cropped dataset with the area not less than 480x480 pixels.  |

### Load Raw Dataset with Waifuc

We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/sophia_granbluefantasy',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```

## List of Clusters

List of tag clustering results; some outfits may be mined here.

### Raw Text Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 34 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, looking_at_viewer, blush, fingerless_gloves, open_mouth, staff, smile, white_background, simple_background, earrings, long_sleeves |

### Table Version

| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | blush | fingerless_gloves | open_mouth | staff | smile | white_background | simple_background | earrings | long_sleeves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------|:--------------------|:-------------|:--------|:--------|:-------------------|:--------------------|:-----------|:---------------|
| 0 | 34 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X |
tdolega/rag-tge_finetuning-dataset
---
dataset_info:
  features:
  - name: system_prompt
    dtype: string
  - name: user_prompt
    dtype: string
  - name: answer
    dtype: string
  splits:
  - name: train
    num_bytes: 14315084.532454856
    num_examples: 1969
  - name: test
    num_bytes: 581618.467545144
    num_examples: 80
  download_size: 7331404
  dataset_size: 14896703
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: test
    path: data/test-*
license: cc-by-sa-4.0
task_categories:
- text-generation
language:
- en
---

A dataset for fine-tuning an LLM to generate responses with citations to source documents in RAG systems. Based on [hotpot_qa](https://huggingface.co/datasets/hotpot_qa). Generated for the [rag-tge](https://github.com/tdolega/rag-tge) project.
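The three columns (`system_prompt`, `user_prompt`, `answer`) map naturally onto a chat-style fine-tuning format. A minimal sketch of that mapping (the exact template used by rag-tge may differ; `to_chat` is a hypothetical helper, shown here on a dummy row rather than a real dataset record):

```python
def to_chat(row):
    # map the dataset's three columns onto chat turns for supervised fine-tuning
    return [
        {"role": "system", "content": row["system_prompt"]},
        {"role": "user", "content": row["user_prompt"]},
        {"role": "assistant", "content": row["answer"]},
    ]

# dummy example row following the dataset's schema
row = {
    "system_prompt": "Answer using only the provided passages and cite them as [1], [2], ...",
    "user_prompt": "Passages: [1] ...\n\nQuestion: ...",
    "answer": "... [1]",
}
messages = to_chat(row)
print([m["role"] for m in messages])
```

A list like `messages` can then be fed to a tokenizer's chat template when building training examples.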
Oedipe/modeleagg
---
license: unknown
---
autoevaluate/autoeval-eval-phpthinh__exampleem-raw-eb2c05-1728660341
---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- phpthinh/exampleem
eval_info:
  task: text_zero_shot_classification
  model: bigscience/bloom-1b1
  metrics: []
  dataset_name: phpthinh/exampleem
  dataset_config: raw
  dataset_split: test
  col_mapping:
    text: text
    classes: classes
    target: target
---

# Dataset Card for AutoTrain Evaluator

This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:

* Task: Zero-Shot Text Classification
* Model: bigscience/bloom-1b1
* Dataset: phpthinh/exampleem
* Config: raw
* Split: test

To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).

## Contributions

Thanks to [@phpthinh](https://huggingface.co/phpthinh) for evaluating this model.
kristmh/test_med_vs_random
---
configs:
- config_name: default
  data_files:
  - split: test_separate
    path: data/test_separate-*
dataset_info:
  features:
  - name: text_clean
    dtype: string
  - name: labels
    dtype: int64
  - name: class
    dtype: string
  splits:
  - name: test_separate
    num_bytes: 11937780
    num_examples: 14520
  download_size: 5828155
  dataset_size: 11937780
---

# Dataset Card for "test_med_vs_random"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
autoevaluate/autoeval-staging-eval-project-xsum-f0ba0c18-12915724
---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xsum
eval_info:
  task: summarization
  model: facebook/bart-large-xsum
  metrics: ['bleu']
  dataset_name: xsum
  dataset_config: default
  dataset_split: test
  col_mapping:
    text: document
    target: summary
---

# Dataset Card for AutoTrain Evaluator

This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:

* Task: Summarization
* Model: facebook/bart-large-xsum
* Dataset: xsum
* Config: default
* Split: test

To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).

## Contributions

Thanks to [@xarymast](https://huggingface.co/xarymast) for evaluating this model.
Sumsam/Roleplay_training
---
license: mit
---

This dataset comes from the GPTeacher repository, hosted on GitHub by `teknium1`. That repository features a collection of modular datasets generated by GPT-4, encompassing specific modules such as General-Instruct, Roleplay-Instruct, Code-Instruct, and Toolformer. These datasets are created with diverse prompts and structured to include instructions, inputs, and outputs, making them compatible with fine-tuning scripts similar to those used for Alpaca's dataset format.

The Roleplay-Instruct dataset, for instance, includes tasks designed to assume the roles of various characters, both fictional and non-fictional, in different settings and with different personalities. There is also a Code-Instruct dataset with around 5350 code task instructions in various programming languages, showcasing the versatility of the datasets in this repository.
salvame9145/customdataset2
---
dataset_info:
  features:
  - name: text
    dtype: string
  splits:
  - name: train
    num_bytes: 16558
    num_examples: 97
  download_size: 6206
  dataset_size: 16558
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
FaalSa/dfaas4
---
dataset_info:
  features:
  - name: start
    dtype: timestamp[s]
  - name: target
    sequence: float32
  - name: item_id
    dtype: string
  - name: feat_static_cat
    sequence: uint64
  splits:
  - name: train
    num_bytes: 57633
    num_examples: 1
  - name: validation
    num_bytes: 58113
    num_examples: 1
  - name: test
    num_bytes: 58593
    num_examples: 1
  download_size: 35533
  dataset_size: 174339
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: validation
    path: data/validation-*
  - split: test
    path: data/test-*
---
cwchoi/whisper_small_c1
---
dataset_info:
  features:
  - name: input_features
    sequence:
      sequence: float32
  - name: labels
    sequence: int64
  splits:
  - name: train
    num_bytes: 46104240
    num_examples: 48
  - name: test
    num_bytes: 6723944
    num_examples: 7
  - name: valid
    num_bytes: 5762864
    num_examples: 6
  download_size: 9428344
  dataset_size: 58591048
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
  - split: test
    path: data/test-*
  - split: valid
    path: data/valid-*
---
MaxReynolds/cifar10_512x512px
---
dataset_info:
  features:
  - name: label
    dtype:
      class_label:
        names:
          '0': airplane
          '1': automobile
          '2': bird
          '3': cat
          '4': deer
          '5': dog
          '6': frog
          '7': horse
          '8': ship
          '9': truck
  - name: pixel_values
    dtype: image
  splits:
  - name: train
    num_bytes: 6445891560.0
    num_examples: 50000
  download_size: 6446258731
  dataset_size: 6445891560.0
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---

# Dataset Card for "cifar10_512x512px"

[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
1232eee/butters
---
license: unknown
---
ranbaik/customecode1
---
dataset_info:
  features:
  - name: text
    dtype: string
  splits:
  - name: train
    num_bytes: 5826
    num_examples: 39
  download_size: 2572
  dataset_size: 5826
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
Doutran/mylladvd
---
license: openrail
---
notfan/test-llama
---
license: apache-2.0
---
OdiaGenAIdata/culturax-gemma-data
---
dataset_info:
  features:
  - name: text
    dtype: string
  - name: timestamp
    dtype: string
  - name: url
    dtype: string
  - name: source
    dtype: string
  splits:
  - name: train
    num_bytes: 840815079
    num_examples: 153461
  download_size: 321769991
  dataset_size: 840815079
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
dputilov/TTL
---
license: other
---