open-llm-leaderboard/details_mlabonne__FrankenMonarch-7B
---
pretty_name: Evaluation run of mlabonne/FrankenMonarch-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mlabonne/FrankenMonarch-7B](https://huggingface.co/mlabonne/FrankenMonarch-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mlabonne__FrankenMonarch-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-22T17:20:21.379452](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__FrankenMonarch-7B/blob/main/results_2024-03-22T17-20-21.379452.json) (note\
\ that there might be results for other tasks in the repository if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6420333989567434,\n\
\ \"acc_stderr\": 0.03241993808434001,\n \"acc_norm\": 0.6447568500667104,\n\
\ \"acc_norm_stderr\": 0.03308047687658202,\n \"mc1\": 0.5813953488372093,\n\
\ \"mc1_stderr\": 0.017270015284476865,\n \"mc2\": 0.7368744041635789,\n\
\ \"mc2_stderr\": 0.014678925886945521\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6885665529010239,\n \"acc_stderr\": 0.013532472099850945,\n\
\ \"acc_norm\": 0.7158703071672355,\n \"acc_norm_stderr\": 0.013179442447653886\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7140011949810795,\n\
\ \"acc_stderr\": 0.004509652679395676,\n \"acc_norm\": 0.8858793069109739,\n\
\ \"acc_norm_stderr\": 0.003173079807440174\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.67,\n\
\ \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n \
\ \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6490566037735849,\n \"acc_stderr\": 0.029373646253234686,\n\
\ \"acc_norm\": 0.6490566037735849,\n \"acc_norm_stderr\": 0.029373646253234686\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.034765901043041336,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.034765901043041336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663434,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663434\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4470899470899471,\n \"acc_stderr\": 0.025606723995777025,\n \"\
acc_norm\": 0.4470899470899471,\n \"acc_norm_stderr\": 0.025606723995777025\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181012,\n \"\
acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181012\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"\
acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124488,\n \"\
acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124488\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644237,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644237\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.024035489676335082,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.024035489676335082\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114993,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114993\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566545,\n\
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566545\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.41721854304635764,\n \"acc_stderr\": 0.040261414976346104,\n \"\
acc_norm\": 0.41721854304635764,\n \"acc_norm_stderr\": 0.040261414976346104\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5648148148148148,\n \"acc_stderr\": 0.033812000056435254,\n \"\
acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.033812000056435254\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8088235294117647,\n \"acc_stderr\": 0.02759917430064077,\n \"\
acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.02759917430064077\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8227848101265823,\n \"acc_stderr\": 0.02485636418450322,\n \
\ \"acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.02485636418450322\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516302,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516302\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.023636873317489277,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.023636873317489277\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n\
\ \"acc_stderr\": 0.013816335389973143,\n \"acc_norm\": 0.8173690932311622,\n\
\ \"acc_norm_stderr\": 0.013816335389973143\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.024883140570071755,\n\
\ \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.024883140570071755\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38212290502793295,\n\
\ \"acc_stderr\": 0.016251139711570772,\n \"acc_norm\": 0.38212290502793295,\n\
\ \"acc_norm_stderr\": 0.016251139711570772\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757475,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757475\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.026236965881153273,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.026236965881153273\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.02447722285613511,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.02447722285613511\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49022164276401564,\n\
\ \"acc_stderr\": 0.01276779378772933,\n \"acc_norm\": 0.49022164276401564,\n\
\ \"acc_norm_stderr\": 0.01276779378772933\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7022058823529411,\n \"acc_stderr\": 0.027778298701545447,\n\
\ \"acc_norm\": 0.7022058823529411,\n \"acc_norm_stderr\": 0.027778298701545447\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6911764705882353,\n \"acc_stderr\": 0.018690850273595287,\n \
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.018690850273595287\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.027979823538744546,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.027979823538744546\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835816,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835816\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5813953488372093,\n\
\ \"mc1_stderr\": 0.017270015284476865,\n \"mc2\": 0.7368744041635789,\n\
\ \"mc2_stderr\": 0.014678925886945521\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8358326756116812,\n \"acc_stderr\": 0.010410849775222795\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.48673237300985595,\n \
\ \"acc_stderr\": 0.013767635127026322\n }\n}\n```"
repo_url: https://huggingface.co/mlabonne/FrankenMonarch-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|arc:challenge|25_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|gsm8k|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hellaswag|10_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T17-20-21.379452.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-22T17-20-21.379452.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- '**/details_harness|winogrande|5_2024-03-22T17-20-21.379452.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-22T17-20-21.379452.parquet'
- config_name: results
data_files:
- split: 2024_03_22T17_20_21.379452
path:
- results_2024-03-22T17-20-21.379452.parquet
- split: latest
path:
- results_2024-03-22T17-20-21.379452.parquet
---
# Dataset Card for Evaluation run of mlabonne/FrankenMonarch-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mlabonne/FrankenMonarch-7B](https://huggingface.co/mlabonne/FrankenMonarch-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mlabonne__FrankenMonarch-7B",
"harness_winogrande_5",
split="train")
```
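As a non-authoritative sketch, the per-task config names listed in the YAML above can be derived from the raw harness task names (the `harness|task|n_shot` keys in the results JSON) by replacing the `|`, `:`, and `-` separators with underscores. The helper below is an assumption for illustration, not part of the leaderboard tooling:

```python
# Hypothetical helper (not part of the leaderboard tooling): map a raw
# harness task name plus shot count to the config names listed above.
def config_name(task: str, n_shot: int) -> str:
    # '|', ':' and '-' in raw task names become '_' in config names.
    cleaned = task.replace("|", "_").replace(":", "_").replace("-", "_")
    return f"harness_{cleaned}_{n_shot}"

print(config_name("hendrycksTest-world_religions", 5))
print(config_name("truthfulqa:mc", 0))
```

This matches the mapping visible in the YAML, e.g. `harness|truthfulqa:mc|0` becoming the `harness_truthfulqa_mc_0` config.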
## Latest results
These are the [latest results from run 2024-03-22T17:20:21.379452](https://huggingface.co/datasets/open-llm-leaderboard/details_mlabonne__FrankenMonarch-7B/blob/main/results_2024-03-22T17-20-21.379452.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6420333989567434,
"acc_stderr": 0.03241993808434001,
"acc_norm": 0.6447568500667104,
"acc_norm_stderr": 0.03308047687658202,
"mc1": 0.5813953488372093,
"mc1_stderr": 0.017270015284476865,
"mc2": 0.7368744041635789,
"mc2_stderr": 0.014678925886945521
},
"harness|arc:challenge|25": {
"acc": 0.6885665529010239,
"acc_stderr": 0.013532472099850945,
"acc_norm": 0.7158703071672355,
"acc_norm_stderr": 0.013179442447653886
},
"harness|hellaswag|10": {
"acc": 0.7140011949810795,
"acc_stderr": 0.004509652679395676,
"acc_norm": 0.8858793069109739,
"acc_norm_stderr": 0.003173079807440174
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6490566037735849,
"acc_stderr": 0.029373646253234686,
"acc_norm": 0.6490566037735849,
"acc_norm_stderr": 0.029373646253234686
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.034765901043041336,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.034765901043041336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663434,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663434
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4470899470899471,
"acc_stderr": 0.025606723995777025,
"acc_norm": 0.4470899470899471,
"acc_norm_stderr": 0.025606723995777025
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124488,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124488
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644237,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644237
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.024035489676335082,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.024035489676335082
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114993,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114993
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566545,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566545
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.41721854304635764,
"acc_stderr": 0.040261414976346104,
"acc_norm": 0.41721854304635764,
"acc_norm_stderr": 0.040261414976346104
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.033812000056435254,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.033812000056435254
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.02759917430064077,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.02759917430064077
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8227848101265823,
"acc_stderr": 0.02485636418450322,
"acc_norm": 0.8227848101265823,
"acc_norm_stderr": 0.02485636418450322
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489277,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489277
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973143,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973143
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.024883140570071755,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.024883140570071755
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38212290502793295,
"acc_stderr": 0.016251139711570772,
"acc_norm": 0.38212290502793295,
"acc_norm_stderr": 0.016251139711570772
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757475,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757475
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153273,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153273
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.02447722285613511,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.02447722285613511
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49022164276401564,
"acc_stderr": 0.01276779378772933,
"acc_norm": 0.49022164276401564,
"acc_norm_stderr": 0.01276779378772933
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7022058823529411,
"acc_stderr": 0.027778298701545447,
"acc_norm": 0.7022058823529411,
"acc_norm_stderr": 0.027778298701545447
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.018690850273595287,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.018690850273595287
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.027979823538744546,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.027979823538744546
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835816,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835816
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5813953488372093,
"mc1_stderr": 0.017270015284476865,
"mc2": 0.7368744041635789,
"mc2_stderr": 0.014678925886945521
},
"harness|winogrande|5": {
"acc": 0.8358326756116812,
"acc_stderr": 0.010410849775222795
},
"harness|gsm8k|5": {
"acc": 0.48673237300985595,
"acc_stderr": 0.013767635127026322
}
}
```
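The per-task entries above can be aggregated by simple averaging. As a rough sketch (using only a small excerpt of the JSON above, not the full results, so the number it produces is illustrative and does not match the `"all"` block), the mean normalized accuracy over the MMLU (hendrycksTest) subtasks could be computed like this:

```python
# Excerpt of the results JSON above (two hendrycksTest subtasks plus one
# non-MMLU task, for illustration only).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.3},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.5925925925925926},
    "harness|winogrande|5": {"acc": 0.8358326756116812},
}

# Average normalized accuracy over the MMLU (hendrycksTest) subtasks only.
mmlu_scores = [
    v["acc_norm"]
    for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_scores) / len(mmlu_scores)
print(mmlu_avg)
```

The real `"all"` aggregate averages across every evaluated task, so reproducing it exactly would require iterating over all entries in the full JSON file.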
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CVasNLPExperiments/Hatefulmemes_test_google_flan_t5_xxl_mode_T_CM_OCR_rices_ns_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
sequence: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large__text
num_bytes: 6098539
num_examples: 1000
download_size: 1145295
dataset_size: 6098539
---
# Dataset Card for "Hatefulmemes_test_google_flan_t5_xxl_mode_T_CM_OCR_rices_ns_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mickylan2367/ColorSpectrogram | ---
language:
- en
tags:
- music
- art
---
## Spectrograms of the music from Google/MusicCaps
* Spectrograms of Google/MusicCaps; a color version is provided as well.
### Basic information
* sampling_rate: int = 44100
## References and notes
* (memo) Honestly, the grayscale version can probably be obtained simply by applying torchvision.transform's grayscale conversion to the color version.
* The code used for downloading is <a href="https://colab.research.google.com/drive/1HmDorbxD5g6C2WDjLierUqbhecTdRvgA?usp=sharing">here</a>
* Reference: https://www.kaggle.com/code/osanseviero/musiccaps-explorer
* Pipeline: download the wav files with the Kaggle reference code -> while generating the spectrograms, write JSON lines such as
```
{"filename":"spectrogram_*.png", "caption":"This is beautiful music"}
```
to metadata.jsonl, then upload the result
* If the Hugging Face dataset viewer stops working, it is also worth downloading the dataset once in Google Colab to check
* Surprisingly, it may just be Hugging Face itself that is buggy (true story (´;ω;`)) |
hqfx/fc_sample | ---
dataset_info:
features:
- name: functions
dtype: string
- name: conversation
list:
- name: content
dtype: string
- name: function_call
struct:
- name: arguments
dtype: string
- name: name
dtype: string
- name: name
dtype: string
- name: role
dtype: string
splits:
- name: zh_easy_v1
num_bytes: 15168.989180972818
num_examples: 10
- name: zh_easy_v2
num_bytes: 55189.360492657506
num_examples: 10
- name: en_hard
num_bytes: 12585.883890024994
num_examples: 10
- name: en_react
num_bytes: 126288.2458364296
num_examples: 20
- name: zh_hard
num_bytes: 117715.8407079646
num_examples: 10
- name: zh_agent
num_bytes: 60719.32730923695
num_examples: 10
download_size: 209654
dataset_size: 387667.6474172865
configs:
- config_name: default
data_files:
- split: zh_easy_v1
path: data/zh_easy_v1-*
- split: zh_easy_v2
path: data/zh_easy_v2-*
- split: en_hard
path: data/en_hard-*
- split: en_react
path: data/en_react-*
- split: zh_hard
path: data/zh_hard-*
- split: zh_agent
path: data/zh_agent-*
---
|
4eJIoBek/Old-GIFs-22k | ---
license: unknown
---
|
jlbaker361/kaggle_males_dim_128_0.5k | ---
dataset_info:
features:
- name: image
dtype: image
- name: split
dtype: string
- name: src
dtype: string
- name: style
dtype: string
splits:
- name: train
num_bytes: 10599618.0
num_examples: 500
download_size: 10582625
dataset_size: 10599618.0
---
# Dataset Card for "kaggle_males_dim_128_0.5k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
GilsonRDF/ExercisesLlama | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2306.4
num_examples: 24
- name: test
num_bytes: 576.6
num_examples: 6
download_size: 4045
dataset_size: 2883.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
bot-yaya/undl_es2en_aligned | ---
dataset_info:
features:
- name: record
dtype: string
- name: clean_para_index_set_pair
dtype: string
- name: src
dtype: string
- name: dst
dtype: string
- name: src_text
dtype: string
- name: dst_text
dtype: string
- name: src_rate
dtype: float64
- name: dst_rate
dtype: float64
splits:
- name: train
num_bytes: 10706600254
num_examples: 15967431
download_size: 0
dataset_size: 10706600254
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "undl_es2en_aligned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
blacknightbV1/test | ---
license: cc-by-nd-4.0
---
|
SEACrowd/emotcmt | ---
license: mit
tags:
- emotion-classification
language:
- ind
---
# emotcmt
EmotCMT is an Indonesian-English code-mixed emotion classification dataset created through a Twitter data pipeline consisting of 4 processing steps: tokenization, language identification, lexical normalization, and translation. The dataset consists of 825 tweets and 22,736 tokens (11,204 Indonesian and 5,613 English). Each tweet is labelled with one emotion: cinta (love), takut (fear), sedih (sadness), senang (joy), or marah (anger).
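For convenience, the five emotion labels and their English glosses from the description above can be kept in a small mapping (an illustrative sketch, not part of the dataset itself):
```python
# Emotion labels used in EmotCMT, with English glosses as given in this card.
EMOTION_GLOSSES = {
    "cinta": "love",
    "takut": "fear",
    "sedih": "sadness",
    "senang": "joy",
    "marah": "anger",
}
```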
## Dataset Usage
Run `pip install nusacrowd` before loading the dataset through HuggingFace's `load_dataset`.
## Citation
```
@inproceedings{barik-etal-2019-normalization,
title = "Normalization of {I}ndonesian-{E}nglish Code-Mixed {T}witter Data",
author = "Barik, Anab Maulana and
Mahendra, Rahmad and
Adriani, Mirna",
booktitle = "Proceedings of the 5th Workshop on Noisy User-generated Text (W-NUT 2019)",
month = nov,
year = "2019",
address = "Hong Kong, China",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/D19-5554",
doi = "10.18653/v1/D19-5554",
pages = "417--424"
}
@article{Yulianti2021NormalisationOI,
title={Normalisation of Indonesian-English Code-Mixed Text and its Effect on Emotion Classification},
author={Evi Yulianti and Ajmal Kurnia and Mirna Adriani and Yoppy Setyo Duto},
journal={International Journal of Advanced Computer Science and Applications},
year={2021}
}
```
## License
MIT
## Homepage
[https://github.com/ir-nlp-csui/emotcmt](https://github.com/ir-nlp-csui/emotcmt)
### NusaCatalogue
For easy indexing and metadata: [https://indonlp.github.io/nusa-catalogue](https://indonlp.github.io/nusa-catalogue) |
Tippawan/test2-data-semi-trainulb-r2 | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence: int64
- name: prob
sequence: float64
- name: ifpass
sequence: int64
- name: pred
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 128419205
num_examples: 44009
download_size: 25269174
dataset_size: 128419205
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Pablao0948/Austin_Mahonne | ---
license: openrail
---
|
macadeliccc/distilabel-neurology-instructions | ---
dataset_info:
features:
- name: instructions
dtype: string
splits:
- name: train
num_bytes: 372401
num_examples: 4000
download_size: 96796
dataset_size: 372401
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_MaziyarPanahi__Topxtral-4x7B-v0.1 | ---
pretty_name: Evaluation run of MaziyarPanahi/Topxtral-4x7B-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MaziyarPanahi/Topxtral-4x7B-v0.1](https://huggingface.co/MaziyarPanahi/Topxtral-4x7B-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MaziyarPanahi__Topxtral-4x7B-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-31T00:17:39.711118](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__Topxtral-4x7B-v0.1/blob/main/results_2024-03-31T00-17-39.711118.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6554293011175572,\n\
\ \"acc_stderr\": 0.03197823221757451,\n \"acc_norm\": 0.6548903178735839,\n\
\ \"acc_norm_stderr\": 0.032644868359495864,\n \"mc1\": 0.5777233782129743,\n\
\ \"mc1_stderr\": 0.017290733254248177,\n \"mc2\": 0.7337665152055244,\n\
\ \"mc2_stderr\": 0.014429693549028136\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6996587030716723,\n \"acc_stderr\": 0.013395909309957007,\n\
\ \"acc_norm\": 0.7252559726962458,\n \"acc_norm_stderr\": 0.013044617212771227\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7024497112129058,\n\
\ \"acc_stderr\": 0.004562462665505233,\n \"acc_norm\": 0.8832901812387971,\n\
\ \"acc_norm_stderr\": 0.003204180072942374\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.049135952012744975,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.049135952012744975\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n\
\ \"acc_stderr\": 0.023157879349083522,\n \"acc_norm\": 0.7903225806451613,\n\
\ \"acc_norm_stderr\": 0.023157879349083522\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768766,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768766\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971118,\n\
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971118\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8385321100917431,\n \"acc_stderr\": 0.01577623925616323,\n \"\
acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.01577623925616323\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.02508596114457966,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.02508596114457966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45139664804469276,\n\
\ \"acc_stderr\": 0.01664330737231587,\n \"acc_norm\": 0.45139664804469276,\n\
\ \"acc_norm_stderr\": 0.01664330737231587\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042103,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042103\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n\
\ \"acc_stderr\": 0.012743072942653349,\n \"acc_norm\": 0.46740547588005216,\n\
\ \"acc_norm_stderr\": 0.012743072942653349\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396553,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396553\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069443,\n \
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069443\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5777233782129743,\n\
\ \"mc1_stderr\": 0.017290733254248177,\n \"mc2\": 0.7337665152055244,\n\
\ \"mc2_stderr\": 0.014429693549028136\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8318863456985004,\n \"acc_stderr\": 0.010510336954166737\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7172100075815011,\n \
\ \"acc_stderr\": 0.012405020417873615\n }\n}\n```"
repo_url: https://huggingface.co/MaziyarPanahi/Topxtral-4x7B-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|arc:challenge|25_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|gsm8k|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hellaswag|10_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T00-17-39.711118.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-31T00-17-39.711118.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- '**/details_harness|winogrande|5_2024-03-31T00-17-39.711118.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-31T00-17-39.711118.parquet'
- config_name: results
data_files:
- split: 2024_03_31T00_17_39.711118
path:
- results_2024-03-31T00-17-39.711118.parquet
- split: latest
path:
- results_2024-03-31T00-17-39.711118.parquet
---
# Dataset Card for Evaluation run of MaziyarPanahi/Topxtral-4x7B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [MaziyarPanahi/Topxtral-4x7B-v0.1](https://huggingface.co/MaziyarPanahi/Topxtral-4x7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MaziyarPanahi__Topxtral-4x7B-v0.1",
"harness_winogrande_5",
split="train")
```
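The configuration names above follow a fixed convention derived from the harness task ids (e.g. `hendrycksTest-abstract_algebra` with 5-shot evaluation becomes `harness_hendrycksTest_abstract_algebra_5`). A minimal sketch of that mapping; the `config_name` helper is illustrative, not part of any official tooling:

```python
def config_name(task: str, n_shot: int) -> str:
    """Map a harness task id (e.g. 'hendrycksTest-abstract_algebra')
    to the config name used by this dataset."""
    # '|', ':' and '-' in task ids are all flattened to '_' in config names.
    sanitized = task.replace("|", "_").replace(":", "_").replace("-", "_")
    return f"harness_{sanitized}_{n_shot}"

print(config_name("hendrycksTest-abstract_algebra", 5))
# harness_hendrycksTest_abstract_algebra_5
print(config_name("truthfulqa:mc", 0))
# harness_truthfulqa_mc_0
```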
## Latest results
These are the [latest results from run 2024-03-31T00:17:39.711118](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__Topxtral-4x7B-v0.1/blob/main/results_2024-03-31T00-17-39.711118.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6554293011175572,
"acc_stderr": 0.03197823221757451,
"acc_norm": 0.6548903178735839,
"acc_norm_stderr": 0.032644868359495864,
"mc1": 0.5777233782129743,
"mc1_stderr": 0.017290733254248177,
"mc2": 0.7337665152055244,
"mc2_stderr": 0.014429693549028136
},
"harness|arc:challenge|25": {
"acc": 0.6996587030716723,
"acc_stderr": 0.013395909309957007,
"acc_norm": 0.7252559726962458,
"acc_norm_stderr": 0.013044617212771227
},
"harness|hellaswag|10": {
"acc": 0.7024497112129058,
"acc_stderr": 0.004562462665505233,
"acc_norm": 0.8832901812387971,
"acc_norm_stderr": 0.003204180072942374
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.049135952012744975,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.049135952012744975
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440678,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440678
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083522,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083522
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768766,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768766
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971118,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971118
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.01577623925616323,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.01577623925616323
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.02508596114457966,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.02508596114457966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.45139664804469276,
"acc_stderr": 0.01664330737231587,
"acc_norm": 0.45139664804469276,
"acc_norm_stderr": 0.01664330737231587
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042103,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042103
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46740547588005216,
"acc_stderr": 0.012743072942653349,
"acc_norm": 0.46740547588005216,
"acc_norm_stderr": 0.012743072942653349
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396553,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396553
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069443,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069443
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5777233782129743,
"mc1_stderr": 0.017290733254248177,
"mc2": 0.7337665152055244,
"mc2_stderr": 0.014429693549028136
},
"harness|winogrande|5": {
"acc": 0.8318863456985004,
"acc_stderr": 0.010510336954166737
},
"harness|gsm8k|5": {
"acc": 0.7172100075815011,
"acc_stderr": 0.012405020417873615
}
}
```
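The per-task entries above can be aggregated locally. A minimal sketch (assuming the results JSON has been loaded into a dict shaped like the one shown; the `sample` dict here is a small stand-in, not real scores) of recomputing an MMLU-style macro-average over the `hendrycksTest` tasks:

```python
def mmlu_macro_avg(results: dict) -> float:
    """Average 'acc_norm' over all hendrycksTest task entries."""
    scores = [
        v["acc_norm"]
        for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")
    ]
    return sum(scores) / len(scores)

# Stand-in sample with the same key shape as the results JSON above.
sample = {
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.64},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.70},
    "harness|winogrande|5": {"acc": 0.83},  # ignored: not an MMLU task
}
print(round(mmlu_macro_avg(sample), 2))  # 0.67
```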
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
deepghs/realbooru_full | ---
license: mit
task_categories:
- image-classification
- zero-shot-image-classification
- text-to-image
language:
- en
tags:
- art
- anime
- not-for-all-audiences
size_categories:
- 100K<n<1M
annotations_creators:
- no-annotation
source_datasets:
- realbooru
---
# RealBooru Full Dataset
This is the full dataset of [realbooru.com](https://realbooru.com/); all of the original images are preserved here.
# Information
## Images/Videos
There are 784640 images/videos in total. The maximum ID of these images is 875358. Last updated at `2024-04-13 21:10:54 UTC`.
Attention: this site's tag-alias system is messy, so we kept the raw `tags` (the original tags returned by the API). **It is strongly recommended to clean these tags before training on them.**
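The cleaning recommended above can start with alias collapsing and basic normalization. A minimal sketch, assuming a hand-curated alias map — the two entries below are illustrative pairs that co-occur in this card's tag tables (`pornstar`/`porn_star`, `tanktop`/`tank_top`); the site's real alias table would need to be scraped or curated separately:

```python
import re

# Illustrative alias map -- these two pairs appear in this card's tables,
# but a real cleaning pass needs the site's full (messy) alias data.
ALIASES = {
    "pornstar": "porn_star",
    "tanktop": "tank_top",
}

def clean_tags(raw_tags: str) -> list[str]:
    """Normalize a space-separated raw tag string from the API."""
    cleaned = set()
    for tag in raw_tags.split():
        tag = tag.strip().lower()
        tag = ALIASES.get(tag, tag)              # collapse known aliases
        if re.fullmatch(r"[\w.\-():']+", tag):   # drop malformed tokens
            cleaned.add(tag)                     # set() deduplicates
    return sorted(cleaned)
```

Deduplication matters here because alias collapsing can make two raw tags identical, as with `tanktop` and `tank_top` above.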
Here is the information for the 50 most recent images:
| id | filename | width | height | type | tags | url |
|-------:|:-----------|--------:|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------|
| 875358 | 875358.jpg | 1572 | 1572 | image/jpeg | balls glasses lipstick nude oiled penis pierbi tattoo trans_female transgender two_tone_hair | https://realbooru.com/images/66/7d/667db540f38268c9d59d6eb7ec71c322.jpeg |
| 875357 | 875357.png | 1365 | 767 | image/png | brunette bwc couch exxxtra_small exxxtrasmall.com exxxtrassmall fellatio handjob_while_sucking huge_cock huge_dick huge_penis imminent_vaginal johnny_champ living_room looking_at_another nail_polish nude painted_nails petite purple_nail_polish purple_nails sam_summers small_breasts | https://realbooru.com/images/46/c3/46c3bc3da7223de9401e091feea665ec.png |
| 875355 | 875355.gif | 300 | 400 | image/gif | 1boy1girl asian_female ball_sucking bbc big_penis black_male blowjob handjob holding_penis huge_penis interracial kneeling licking_penis looking_at_another mandingo oral sucking_testicles tagme | https://realbooru.com/images/f3/53/f35337ffcec773ab4aa52cfe9c3be851.gif |
| 875354 | 875354.gif | 300 | 533 | image/gif | 1boy1girl ball_sucking bbc big_penis black_male blowjob dark-skinned_male huge_penis interracial jax_slayher kyler_quinn light-skinned_female looking_at_viewer oral penis_on_face pov sucking_testicles tagme white_female | https://realbooru.com/images/c6/9a/c69a8dd6fc62d5ee66446ae12d2f341b.gif |
| 875353 | 875353.gif | 500 | 301 | image/gif | bbc big_penis blowjob brickzilla closeup curved_penis drooling eyes_rolling_back freya_von_doom handjob huge_penis interracial just_the_tip oral porn_star saliva sloppy tagme tight_fit:_brickzilla_3 truex | https://realbooru.com/images/90/69/9069dc0c7b3669a247ecbcd92632a9f2.gif |
| 875351 | 875351.gif | 320 | 481 | image/gif | 1boy1girl bbc bent_over big_ass big_penis black_male dark-skinned_male emma_hix hollywood_cash hotdogging huge_penis interracial light-skinned_female looking_back male_pov on_all_fours porn_star pov she_loves_black tagme teasing white_female | https://realbooru.com/images/c3/7f/c37f0b3a6dea81697ed83ad56153b1f0.gif |
| 875349 | 875349.gif | 300 | 525 | image/gif | bbc big_penis blacked_raw blowjob carolina_sweets curved_penis handjob huge_penis interracial julio_gomez licking_penis looking_at_viewer oral porn_star tagme | https://realbooru.com/images/ec/49/ec4969d17f2e4847241cf5e3953690b4.gif |
| 875348 | 875348.gif | 480 | 360 | image/gif | 1boy1girl bbc big_penis black_male brickzilla curved_penis dark-skinned_male deep_penetration faceless_female huge_penis insertion interracial light-skinned_female pull_out squirting tagme vaginal valerica_steele white_female | https://realbooru.com/images/85/90/8590a7b241492c402faec32daf929875.gif |
| 875346 | 875346.gif | 540 | 408 | image/gif | 1boy1girl big_ass big_penis bwc carrying closeup don't_break_me_15 dont_break_me faceless_female holding_legs huge_ass huge_penis j-mac jmac kimmy_granger lifting_person light-skinned_female light-skinned_male mofos penetration tagme vaginal white_female white_male | https://realbooru.com/images/24/e2/24e258bf68cf6d94a9bb35f2734044d1.gif |
| 875345 | 875345.gif | 248 | 500 | image/gif | big_penis blowjob gif handjob huge_penis looking_at_penis oral sadie_west shane_diesel sucking_penis tagme thick_penis two-handed_handjob | https://realbooru.com/images/cf/16/cf1609293191e3d8556a9e47ae458ea0.gif |
| 875342 | 875342.jpg | 960 | 1280 | image/jpeg | 1girl ass ass_focus breasts brown_hair large_breasts mature_female medium_hair nude onlyfans panties pervert pervert_female sex_invitation sexually_suggestive short_hair solo solo_focus uncensored | https://realbooru.com/images/2e/b6/2eb613160e8ab16f006319bb15268e60.jpeg |
| 875340 | 875340.gif | 384 | 682 | image/gif | animated animated_gif anus asian ass black_hair breasts censored dildo erection femboy gif heels lipstick looking_at_viewer masturbation nipples penis samyyesry sitting small_breasts smile smiling testicles transgender trap | https://realbooru.com/images/b9/c3/b9c35a35777d7079fb3a5dfbf5c1f3a3.gif |
| 875339 | 875339.jpg | 2268 | 3024 | image/jpeg | anus asian ass bed black_hair butt_plug censored feet femboy fishnets flaccid legs_up looking_at_viewer penis presenting_ass samyyesry skirt smile smiling solo testicles transgender trap | https://realbooru.com/images/c4/04/c40478e625bf0031d21178e7210fa51a.jpeg |
| 875337 | 875337.jpg | 3024 | 4032 | image/jpeg | anus asian ass bed black_hair butt_plug feet femboy fishnets flaccid looking_at_viewer looking_back penis presenting_ass samyyesry skirt smile smiling solo spread_ass tattoo testicles transgender trap | https://realbooru.com/images/9d/e8/9de8fd3cf7422b5a4a777124c5d5c74a.jpeg |
| 875336 | 875336.jpg | 3024 | 4032 | image/jpeg | anus asian ass bed black_hair butt_plug eyes_closed feet femboy fishnets flaccid penis presenting_ass samyyesry skirt smile smiling solo spread_ass tattoo testicles transgender trap | https://realbooru.com/images/b4/b5/b4b5a80dd3e0dfbb1171a72b0adbd820.jpeg |
| 875326 | 875326.jpg | 1536 | 1920 | image/jpeg | 1girl ass ass_focus big_ass bikini blonde_hair bottom_heavy bubble_butt curvy dat_ass female_only from_behind huge_ass light-skinned_female looking_at_viewer looking_back mia_malkova pawg porn_star posing solo_female standing thick_thighs voluptuous wide_hips | https://realbooru.com/images/ba/46/ba469147e6506b0462c0e263883e5992.jpeg |
| 875322 | 875322.jpg | 948 | 1280 | image/jpeg | akiyama_syoko asian breasts brown_hair japanese jav medium_breasts nipples nude pubic_hair | https://realbooru.com/images/4c/c0/4cc0549738a316468bf73140a0809a86.jpeg |
| 875321 | 875321.jpg | 1280 | 725 | image/jpeg | akiyama_syoko asian breasts brown_hair japanese jav laying_on_side medium_breasts nipples nude panties topless | https://realbooru.com/images/d9/78/d9780b5c4f03aa1540e351e3a65dafd4.jpeg |
| 875320 | 875320.jpg | 948 | 1280 | image/jpeg | akiyama_syoko asian breasts brown_hair japanese jav medium_breasts nipples nude pubic_hair | https://realbooru.com/images/8e/b7/8eb76f941a2178639261d02d941cee06.jpeg |
| 875319 | 875319.jpg | 1280 | 793 | image/jpeg | akiyama_syoko asian bra_lift breasts brown_hair hat japanese jav medium_breasts nipples nude straw_hat | https://realbooru.com/images/fa/82/fa827b7f9d261bf4f09c43f1bc8d28ea.jpeg |
| 875318 | 875318.jpg | 853 | 1280 | image/jpeg | akiyama_syoko asian bathtub breasts brown_hair from_above japanese jav medium_breasts nipples nude pubic_hair water | https://realbooru.com/images/b2/04/b204fa9838ea4597619060773425866f.jpeg |
| 875317 | 875317.jpg | 853 | 1280 | image/jpeg | akiyama_syoko asian bathroom breasts brown_hair japanese jav medium_breasts nipples nude pubic_hair shower showering spread_legs | https://realbooru.com/images/ab/53/ab53ef6e197361dec74f4aee5fd2f8e9.jpeg |
| 875316 | 875316.jpg | 1280 | 804 | image/jpeg | akiyama_syoko asian breasts breasts_out brown_hair japanese jav medium_breasts nipples nude shirt_down | https://realbooru.com/images/fb/ef/fbefa216ba8e0f1b2d023f84626e486b.jpeg |
| 875315 | 875315.jpg | 885 | 1280 | image/jpeg | akiyama_syoko asian breasts brown_hair high_heels japanese jav medium_breasts nipples nude tagme | https://realbooru.com/images/01/22/01227db791be0e208eabb6bb46d6725b.jpeg |
| 875314 | 875314.jpg | 1280 | 793 | image/jpeg | akiyama_syoko asian breasts brown_hair japanese jav kimono medium_breasts nipples nude open_clothes pubic_hair | https://realbooru.com/images/7d/fd/7dfd45c7116c32bb011e93f068af3a6e.jpeg |
| 875313 | 875313.jpg | 853 | 1220 | image/jpeg | akiyama_syoko asian breasts brown_hair japanese jav kimono medium_breasts nipples nude open_clothes | https://realbooru.com/images/f2/42/f24279359c56a81ad0a32964d5156bf6.jpeg |
| 875312 | 875312.jpg | 1280 | 804 | image/jpeg | akiyama_syoko asian bottomless breasts brown_hair japanese jav medium_breasts nipples nipples_visible_through_clothing nude pubic_hair tank_top wet_shirt | https://realbooru.com/images/0d/17/0d176adfb257b4c687a3a6179ae44539.jpeg |
| 875311 | 875311.jpg | 1280 | 804 | image/jpeg | akiyama_syoko asian bottomless breasts brown_hair japanese jav medium_breasts nipples nipples_visible_through_clothing nude tanktop wet_shirt | https://realbooru.com/images/a3/da/a3da6d4f0236a520de409a7fedc41ca1.jpeg |
| 875310 | 875310.jpg | 885 | 1280 | image/jpeg | akiyama_syoko asian breasts brown_hair fishnet_stockings japanese jav medium_breasts nipples nude | https://realbooru.com/images/30/42/304200f147f24e0aae6265ca43a0efd9.jpeg |
| 875309 | 875309.jpg | 853 | 1280 | image/jpeg | akiyama_syoko asian breasts brown_hair japanese jav medium_breasts nipples nude open_clothes | https://realbooru.com/images/24/03/240376e04990f9d2b620f4c8fb4f60fd.jpeg |
| 875308 | 875308.jpg | 948 | 1280 | image/jpeg | akiyama_syoko asian breasts brown_hair japanese jav medium_breasts nipples nude panties standing topless | https://realbooru.com/images/12/c4/12c4f5dcc8d54785a40e76ac521ec425.jpeg |
| 875307 | 875307.jpg | 948 | 1280 | image/jpeg | akiyama_syoko asian breasts brown_hair japanese jav medium_breasts nipples nude socks | https://realbooru.com/images/7b/7d/7b7def6baf80cea15f5b3553ab1db18e.jpeg |
| 875306 | 875306.jpg | 948 | 1280 | image/jpeg | akiyama_syoko asian breasts brown_hair japanese jav medium_breasts nipples nude panties_around_legs pubic_hair socks | https://realbooru.com/images/cf/09/cf0923a3880139274676139a4537adda.jpeg |
| 875305 | 875305.jpg | 1179 | 1553 | image/jpeg | 1girl black_hair coinslot_pussy cosplay demon demon_girl demon_tail horns interior kinsou_no_vermeil labia legs_up photo_(medium) pussy real_life sitting solo spread_pussy succubus tagme tail thighhighs twobrattycats vermeil vermeil_(kinsou_no_vermeil) vermeil_(kinsou_no_vermeil)_(cosplay) vermeil_in_gold | https://realbooru.com/images/af/59/af590bc8f9a4e230b92e6085605ac44b.jpeg |
| 875304 | 875304.jpg | 1080 | 1364 | image/jpeg | breasts cleft_of_venus cosplay dress green_hair highres labia medium_breasts nipples one-punch_man photo_(medium) real_life selfie side-by-side spread_legs tatsumaki tatsumaki_(cosplay) twobrattycats | https://realbooru.com/images/2c/18/2c18fa7d274e50397b48ac5e2b1465a1.jpeg |
| 875301 | 875301.jpg | 1080 | 946 | image/jpeg | breasts brown_hair cleft_of_venus clothes_lift cosplay dress dress_lift labia medium_breasts nipples orange_sweater photo_(medium) pussy real_life scooby-doo selfie side-by-side skirt spread_pussy sweater twobrattycats velma_dace_dinkley velma_dace_dinkley_(cosplay) velma_dinkley | https://realbooru.com/images/a6/90/a690508a8a9863ec50104b71c68f6e8b.jpeg |
| 875296 | 875296.jpg | 3024 | 4032 | image/jpeg | allexisbunny anus ass bent_over big_ass blonde_hair heels long_hair penis presenting_ass skirt solo testicles transgender trap | https://realbooru.com/images/d4/63/d463d0a1a65f48cbd65e8fac696fedc6.jpeg |
| 875295 | 875295.jpg | 3024 | 4032 | image/jpeg | allexisbunny anus ass bent_over big_ass blonde_hair lipstick long_hair penis presenting_ass solo testicles thicc thick transgender trap | https://realbooru.com/images/d7/0f/d70fd9e55c15463979e2be8c803799c1.jpeg |
| 875294 | 875294.jpg | 3024 | 4032 | image/jpeg | allexisbunny ass big_ass blonde_hair lipstick long_hair looking_at_viewer looking_back presenting_ass sideboob solo thicc thick transgender trap | https://realbooru.com/images/55/42/5542f93b9bbc0eaf2d886d22fcf78103.jpeg |
| 875293 | 875293.jpg | 3024 | 4032 | image/jpeg | allexisbunny ass big_ass blonde_hair lipstick long_hair looking_at_viewer looking_back presenting_ass skirt solo thicc thick transgender trap | https://realbooru.com/images/62/ca/62ca6d6ee30c55cd72448af459e6f388.jpeg |
| 875292 | 875292.jpg | 3024 | 4032 | image/jpeg | allexisbunny ass bent_over big_ass blonde_hair lipstick long_hair looking_at_viewer looking_back presenting_ass solo tattoo thicc thick thong thong_down transgender trap | https://realbooru.com/images/1e/a5/1ea5e5dcd5a76475377e0e713e17df77.jpeg |
| 875291 | 875291.jpg | 3024 | 4032 | image/jpeg | allexisbunny ass big_ass blonde_hair lipstick long_hair looking_at_viewer looking_back presenting_ass sideboob smile smiling solo thicc thick thong transgender trap | https://realbooru.com/images/d9/39/d9394c78d3133f5d01be0460dabaea0f.jpeg |
| 875289 | 875289.gif | 250 | 444 | image/gif | 1girl animated big_ass big_breasts blue_hair completely_nude cortana_blue female_only gif solo solo_female tattoo white_female | https://realbooru.com/images/49/8c/498cb6518d765ed985c082fd94da34d4.gif |
| 875288 | 875288.jpg | 1536 | 2048 | image/jpeg | 1girl beanie big_ass female_only glasses green_panties looking_at_viewer non-nude sitting solo solo_female white_female | https://realbooru.com/images/88/41/8841cb99f6bc8078aad093624c4e5502.jpeg |
| 875280 | 875280.png | 2560 | 3413 | image/png | 1girl back_tattoo big_ass female_only looking_at_viewer mandy_muse pink_bra pink_panties solo solo_female white_female | https://realbooru.com/images/22/47/2247bac9e61c46840f180f0ff88fa6db.png |
| 875279 | 875279.jpg | 810 | 1006 | image/jpeg | fully_nude holding_breasts large_breasts nipples_covered sweetieline wink | https://realbooru.com/images/77/2c/772c7ae8a9bb8f3d0d78bd13393bcf0a.jpeg |
| 875278 | 875278.jpg | 810 | 1080 | image/jpeg | cleavage cosplay large_breasts maid_uniform nipple sweetieline | https://realbooru.com/images/a6/41/a641d6d6a0e77cde36cd7cff3b31f3aa.jpeg |
| 875277 | 875277.jpg | 1080 | 1440 | image/jpeg | completely_nude labia medium_breasts nipples outdoors pussy sweetie_fox | https://realbooru.com/images/c1/68/c1682ae2c33d1a0375b7fb1f522f4b90.jpeg |
| 875276 | 875276.jpg | 702 | 702 | image/jpeg | blue_hair bulma bulma_briefs cosplay dragon_ball flat_chest nipples outdoors pubic_hair ragmig skirt small_breasts | https://realbooru.com/images/b7/ac/b7acfb70198d31acdd488021b5167d35.jpeg |
| 875275 | 875275.jpg | 2316 | 3088 | image/jpeg | 1girl ass bed big_ass blonde_hair female female_only fishnets legs_up long_hair looking_at_viewer milf onlyfans porn_star presenting_ass pussy see-through solo sophie_dee text thicc thick thong watermark | https://realbooru.com/images/94/5a/945ae31fe348197c626d5367fea372c2.jpeg |
## Tags
There are 51011 tags in total.
These are the top 30 tags of type `artist`:
| tag | type | count | ambiguous |
|:----------------|:-------|--------:|:------------|
| julia | artist | 869 | False |
| rubberella | artist | 231 | False |
| fpp | artist | 130 | True |
| charm | artist | 86 | True |
| sklfck | artist | 82 | False |
| jj.am | artist | 72 | False |
| 4gifs | artist | 56 | False |
| lcfakeword | artist | 53 | False |
| nyaneko | artist | 50 | False |
| kinkymarie | artist | 37 | False |
| vargas_fakes | artist | 32 | False |
| klixen | artist | 30 | False |
| zennsfw | artist | 27 | False |
| fake_nation | artist | 24 | False |
| demond4n | artist | 20 | False |
| gifporntube.com | artist | 19 | False |
| nurunuru | artist | 19 | False |
| vandych | artist | 18 | False |
| pr0ncave | artist | 16 | False |
| bbwgothcumsex | artist | 15 | False |
| nero | artist | 15 | False |
| nylon | artist | 15 | False |
| rickoliver1969 | artist | 15 | False |
| used | artist | 15 | False |
| alyssa_at_night | artist | 14 | False |
| luigi2k16 | artist | 14 | False |
| nsfwgifer | artist | 14 | False |
| hitachi | artist | 13 | False |
| 171gifs | artist | 12 | False |
| celebfakee | artist | 12 | False |
These are the top 30 tags of type `character`:
| tag | type | count | ambiguous |
|:---------------------|:----------|--------:|:------------|
| asuka_langley_sohryu | character | 583 | False |
| cammy_white | character | 463 | False |
| yorha_2b | character | 328 | False |
| chun-li | character | 321 | False |
| velma_dinkley | character | 309 | False |
| yoko_littner | character | 309 | False |
| lara_croft | character | 301 | False |
| harley_quinn | character | 292 | False |
| wonder_woman | character | 237 | False |
| spider-man | character | 234 | False |
| kashiwazaki_sena | character | 216 | False |
| usagi | character | 209 | False |
| morrigan_aensland | character | 194 | False |
| rei_ayanami | character | 185 | False |
| yor_forger | character | 177 | False |
| mai_shiranui | character | 176 | False |
| kitagawa_marin | character | 166 | False |
| misty_(pokemon) | character | 164 | False |
| d.va | character | 154 | False |
| 2b | character | 152 | False |
| tifa_lockhart | character | 146 | False |
| tomoe | character | 145 | False |
| tatsumaki | character | 136 | False |
| batman | character | 135 | False |
| mavis_dracula | character | 132 | False |
| samus_aran | character | 128 | False |
| makima | character | 124 | False |
| snow_white | character | 122 | False |
| supergirl | character | 120 | False |
| hatsune_miku | character | 118 | False |
These are the top 30 tags of type `copyright`:
| tag | type | count | ambiguous |
|:------------------------|:----------|--------:|:------------|
| rubberdoll | copyright | 27947 | False |
| onlyfans | copyright | 6286 | False |
| brazzers | copyright | 5070 | False |
| bangbros | copyright | 2767 | False |
| jav | copyright | 2480 | False |
| instagram | copyright | 2337 | False |
| blacked | copyright | 1799 | False |
| ddf | copyright | 1367 | False |
| dc | copyright | 1292 | False |
| pornhub | copyright | 1287 | False |
| ftv_girls | copyright | 1210 | False |
| evil_angel | copyright | 1206 | False |
| realitykings | copyright | 1097 | False |
| twitter | copyright | 1055 | False |
| naughty_america | copyright | 1053 | False |
| marvel | copyright | 886 | False |
| neon_genesis_evangelion | copyright | 851 | False |
| cosplaydeviants | copyright | 845 | False |
| playboy | copyright | 833 | False |
| street_fighter | copyright | 819 | False |
| blacked_raw | copyright | 801 | False |
| shemale_japan | copyright | 690 | False |
| scoreland | copyright | 670 | False |
| reddit | copyright | 663 | False |
| xvideos.com | copyright | 660 | False |
| jules_jordan | copyright | 624 | False |
| score_group | copyright | 561 | False |
| tiktok | copyright | 552 | False |
| aziani_(copyright) | copyright | 513 | False |
| cum4k.com | copyright | 508 | False |
These are the top 30 tags of type `general`:
| tag | type | count | ambiguous |
|:------------------|:--------|--------:|:------------|
| long_hair | general | 644784 | False |
| breasts | general | 612299 | True |
| solo | general | 519530 | False |
| female | general | 508808 | False |
| large_breasts | general | 451532 | False |
| latex | general | 133587 | False |
| high_heels | general | 132197 | False |
| asian | general | 122905 | False |
| shoes | general | 122071 | False |
| shemale | general | 107707 | False |
| ass | general | 104543 | False |
| 1girl | general | 80706 | False |
| nipples | general | 66001 | False |
| navel | general | 55174 | False |
| nude | general | 50073 | False |
| tattoo | general | 45474 | False |
| looking_back | general | 45024 | False |
| penis | general | 44215 | False |
| bed | general | 43285 | False |
| pussy | general | 36534 | False |
| thighhighs | general | 36264 | False |
| big_ass | general | 35291 | True |
| female_only | general | 35262 | False |
| outside | general | 33249 | False |
| cleavage | general | 33139 | False |
| looking_at_viewer | general | 32561 | True |
| piercing | general | 32450 | False |
| earrings | general | 31571 | False |
| erect_nipples | general | 31294 | False |
| 1boy | general | 31104 | False |
These are the top 30 tags of type `metadata`:
| tag | type | count | ambiguous |
|:-----------------|:---------|--------:|:------------|
| watermark | metadata | 414557 | False |
| webm | metadata | 82990 | False |
| tagme | metadata | 59689 | False |
| cosplay | metadata | 58590 | False |
| porn_star | metadata | 30335 | False |
| photo | metadata | 30215 | False |
| sound | metadata | 21615 | False |
| pornstar | metadata | 8714 | False |
| video | metadata | 7895 | False |
| fakes | metadata | 7692 | False |
| censored | metadata | 6592 | False |
| sourced | metadata | 6419 | False |
| amateur | metadata | 5626 | False |
| no_sound | metadata | 5591 | False |
| model | metadata | 5530 | False |
| highres | metadata | 4786 | False |
| uncensored | metadata | 4602 | False |
| vertical_video | metadata | 3674 | False |
| source_request | metadata | 3099 | False |
| slut | metadata | 2621 | False |
| beautiful | metadata | 1735 | False |
| lowres | metadata | 1547 | False |
| music | metadata | 1330 | False |
| monochrome | metadata | 1319 | False |
| pornstars | metadata | 1280 | False |
| video_with_sound | metadata | 1234 | False |
| real | metadata | 1093 | False |
| fake | metadata | 936 | True |
| whore | metadata | 916 | False |
| sfw | metadata | 908 | False |
These are the top 30 tags of type `model`:
| tag | type | count | ambiguous |
|:-------------------------|:-------|--------:|:------------|
| bianca_beauchamp | model | 159513 | False |
| susan_wayland | model | 45623 | False |
| bailey_jay | model | 34880 | False |
| sarina_valentina | model | 16612 | False |
| ashley_george | model | 14747 | False |
| chouzuki_maryou | model | 14178 | False |
| shooting_star | model | 13947 | False |
| gianna_michaels | model | 10374 | False |
| lenfried | model | 8168 | False |
| nonsummerjack | model | 6762 | False |
| nicole_graves | model | 5444 | False |
| candy_charms | model | 4934 | False |
| hitomi_tanaka | model | 4856 | False |
| sugihara_anri | model | 4425 | False |
| rikki_six | model | 4180 | False |
| kimber_james | model | 3813 | False |
| madison_ivy | model | 3152 | False |
| emma_butt | model | 2920 | False |
| foxxy | model | 2906 | False |
| miran | model | 2655 | False |
| jaime_hammer | model | 2648 | False |
| ashiya_noriko | model | 2625 | False |
| maserati | model | 2567 | False |
| asami_yuma | model | 2307 | False |
| vanessa | model | 2158 | False |
| marie-claude_bourbonnais | model | 2034 | False |
| enako | model | 2013 | False |
| kelly_clare | model | 1941 | False |
| amy_anderssen | model | 1863 | False |
| emily_addison | model | 1652 | False |
These are the top 30 tags of type `unknown`:
| tag | type | count | ambiguous |
|:---------------------|:--------|--------:|:------------|
| jensen_ackles | unknown | 310 | False |
| dean_winchester | unknown | 294 | False |
| ariana_grande | unknown | 206 | False |
| 1_human | unknown | 204 | False |
| princess_leia_organa | unknown | 175 | False |
| video_games | unknown | 155 | False |
| multiple_males | unknown | 149 | False |
| carrie_fisher | unknown | 143 | False |
| high_res | unknown | 131 | False |
| miranda_cosgrove | unknown | 125 | False |
| angelo_mysterioso | unknown | 99 | False |
| star_trek_voyager | unknown | 95 | False |
| jared_padalecki | unknown | 87 | False |
| ghetto_gaggers | unknown | 80 | False |
| dangling_testicles | unknown | 79 | False |
| 3d | unknown | 78 | False |
| humanoid | unknown | 77 | False |
| sam_winchester | unknown | 75 | False |
| games | unknown | 74 | False |
| titty_fuck | unknown | 71 | False |
| tittyfucking | unknown | 67 | False |
| xnalara | unknown | 66 | False |
| xps | unknown | 66 | False |
| incipient_kiss | unknown | 63 | False |
| amanda_tapping | unknown | 62 | False |
| kaley_cuoco | unknown | 57 | False |
| friends | unknown | 55 | False |
| mass_effect | unknown | 55 | False |
| beige_skin | unknown | 54 | False |
| natalie_portman | unknown | 53 | False |
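Tables like the ones above can be rebuilt from a flat tag index. A minimal sketch, assuming rows in a `(tag, type, count)` shape — the sample counts below are copied from this card, but the function itself is illustrative, not the script that generated these tables:

```python
from collections import defaultdict

# Sample rows in the (tag, type, count) shape of the tables above; the
# counts are copied from this card, while the full index has 51011 tags.
rows = [
    ("long_hair", "general", 644784),
    ("breasts", "general", 612299),
    ("watermark", "metadata", 414557),
    ("webm", "metadata", 82990),
]

def top_tags(rows, n=30):
    """Group tags by type and keep the n most frequent per type."""
    by_type = defaultdict(list)
    for tag, tag_type, count in rows:
        by_type[tag_type].append((tag, count))
    return {
        tag_type: sorted(tags, key=lambda t: -t[1])[:n]
        for tag_type, tags in by_type.items()
    }
```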
|
soulfree89/llama2kor-test | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1171
num_examples: 6
download_size: 2870
dataset_size: 1171
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bigscience-data/roots_indic-te_wiktionary | ---
language: te
license: cc-by-sa-3.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
|
aabisec/HAdata | ---
license: mit
---
|
autoevaluate/autoeval-eval-mathemakitten__winobias_antistereotype_test_v5-mathemak-0d489a-2053267102 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- mathemakitten/winobias_antistereotype_test_v5
eval_info:
task: text_zero_shot_classification
model: inverse-scaling/opt-13b_eval
metrics: []
dataset_name: mathemakitten/winobias_antistereotype_test_v5
dataset_config: mathemakitten--winobias_antistereotype_test_v5
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: inverse-scaling/opt-13b_eval
* Dataset: mathemakitten/winobias_antistereotype_test_v5
* Config: mathemakitten--winobias_antistereotype_test_v5
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@mathemakitten](https://huggingface.co/mathemakitten) for evaluating this model. |
luvres/Bible-ACF-portuguese | ---
dataset_info:
features:
- name: book
dtype: string
- name: chapter
dtype: string
- name: verse
dtype: string
- name: text
dtype: string
- name: testament
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 5514873
num_examples: 29631
download_size: 2483576
dataset_size: 5514873
---
# Dataset Card for "Bible-ACF-portuguese"
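Given the schema in the metadata above (`book`, `chapter`, `verse`, `text`, `testament`), a minimal sketch of filtering verses by testament. The rows below are illustrative stand-ins, and the actual values stored in the `testament` field may differ from the labels used here:

```python
# Toy rows matching this card's feature schema; real rows would come from
# the "train" split, and the testament labels here are assumed values.
rows = [
    {"book": "Gênesis", "chapter": "1", "verse": "1",
     "text": "No princípio criou Deus os céus e a terra.",
     "testament": "old"},
    {"book": "João", "chapter": "3", "verse": "16",
     "text": "Porque Deus amou o mundo de tal maneira...",
     "testament": "new"},
]

def verses_by_testament(rows, testament):
    """Select the verses belonging to one testament."""
    return [r for r in rows if r["testament"] == testament]
```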
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_AA051610__A0105 | ---
pretty_name: Evaluation run of AA051610/A0105
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [AA051610/A0105](https://huggingface.co/AA051610/A0105) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051610__A0105\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-06T02:57:13.678426](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__A0105/blob/main/results_2024-01-06T02-57-13.678426.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6820912387442206,\n\
\ \"acc_stderr\": 0.031065747816855848,\n \"acc_norm\": 0.6864043317313094,\n\
\ \"acc_norm_stderr\": 0.031666952183494475,\n \"mc1\": 0.3843329253365973,\n\
\ \"mc1_stderr\": 0.0170287073012452,\n \"mc2\": 0.5543558949846231,\n\
\ \"mc2_stderr\": 0.016036294123592646\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6117747440273038,\n \"acc_stderr\": 0.014241614207414044,\n\
\ \"acc_norm\": 0.621160409556314,\n \"acc_norm_stderr\": 0.014175915490000326\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6291575383389763,\n\
\ \"acc_stderr\": 0.004820431839600027,\n \"acc_norm\": 0.8254331806413066,\n\
\ \"acc_norm_stderr\": 0.003788203729346702\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6814814814814815,\n\
\ \"acc_stderr\": 0.04024778401977108,\n \"acc_norm\": 0.6814814814814815,\n\
\ \"acc_norm_stderr\": 0.04024778401977108\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7960526315789473,\n \"acc_stderr\": 0.03279000406310049,\n\
\ \"acc_norm\": 0.7960526315789473,\n \"acc_norm_stderr\": 0.03279000406310049\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.02749566368372405,\n\
\ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.02749566368372405\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7916666666666666,\n\
\ \"acc_stderr\": 0.03396116205845335,\n \"acc_norm\": 0.7916666666666666,\n\
\ \"acc_norm_stderr\": 0.03396116205845335\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.0365634365335316,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.0365634365335316\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6936170212765957,\n \"acc_stderr\": 0.030135906478517563,\n\
\ \"acc_norm\": 0.6936170212765957,\n \"acc_norm_stderr\": 0.030135906478517563\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6551724137931034,\n \"acc_stderr\": 0.03960933549451207,\n\
\ \"acc_norm\": 0.6551724137931034,\n \"acc_norm_stderr\": 0.03960933549451207\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5555555555555556,\n \"acc_stderr\": 0.025591857761382182,\n \"\
acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.025591857761382182\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8225806451612904,\n \"acc_stderr\": 0.02173254068932928,\n \"\
acc_norm\": 0.8225806451612904,\n \"acc_norm_stderr\": 0.02173254068932928\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5320197044334976,\n \"acc_stderr\": 0.035107665979592174,\n \"\
acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.035107665979592174\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.032250781083062896,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.032250781083062896\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8484848484848485,\n \"acc_stderr\": 0.025545650426603617,\n \"\
acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.025545650426603617\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n\
\ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7461538461538462,\n \"acc_stderr\": 0.022066054378726253,\n\
\ \"acc_norm\": 0.7461538461538462,\n \"acc_norm_stderr\": 0.022066054378726253\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652458,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652458\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7605042016806722,\n \"acc_stderr\": 0.027722065493361255,\n\
\ \"acc_norm\": 0.7605042016806722,\n \"acc_norm_stderr\": 0.027722065493361255\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.41721854304635764,\n \"acc_stderr\": 0.040261414976346104,\n \"\
acc_norm\": 0.41721854304635764,\n \"acc_norm_stderr\": 0.040261414976346104\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8605504587155963,\n \"acc_stderr\": 0.014852421490033055,\n \"\
acc_norm\": 0.8605504587155963,\n \"acc_norm_stderr\": 0.014852421490033055\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5879629629629629,\n \"acc_stderr\": 0.03356787758160831,\n \"\
acc_norm\": 0.5879629629629629,\n \"acc_norm_stderr\": 0.03356787758160831\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8823529411764706,\n \"acc_stderr\": 0.022613286601132012,\n \"\
acc_norm\": 0.8823529411764706,\n \"acc_norm_stderr\": 0.022613286601132012\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.869198312236287,\n \"acc_stderr\": 0.021948766059470767,\n \
\ \"acc_norm\": 0.869198312236287,\n \"acc_norm_stderr\": 0.021948766059470767\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7443946188340808,\n\
\ \"acc_stderr\": 0.029275891003969923,\n \"acc_norm\": 0.7443946188340808,\n\
\ \"acc_norm_stderr\": 0.029275891003969923\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n\
\ \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.035817969517092825,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.035817969517092825\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5714285714285714,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.5714285714285714,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.039166677628225836,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.039166677628225836\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n\
\ \"acc_stderr\": 0.01987565502786746,\n \"acc_norm\": 0.8974358974358975,\n\
\ \"acc_norm_stderr\": 0.01987565502786746\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8748403575989783,\n\
\ \"acc_stderr\": 0.011832954239305742,\n \"acc_norm\": 0.8748403575989783,\n\
\ \"acc_norm_stderr\": 0.011832954239305742\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7947976878612717,\n \"acc_stderr\": 0.021742519835276277,\n\
\ \"acc_norm\": 0.7947976878612717,\n \"acc_norm_stderr\": 0.021742519835276277\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3754189944134078,\n\
\ \"acc_stderr\": 0.01619510424846353,\n \"acc_norm\": 0.3754189944134078,\n\
\ \"acc_norm_stderr\": 0.01619510424846353\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7875816993464052,\n \"acc_stderr\": 0.02342037547829613,\n\
\ \"acc_norm\": 0.7875816993464052,\n \"acc_norm_stderr\": 0.02342037547829613\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7684887459807074,\n\
\ \"acc_stderr\": 0.023956532766639133,\n \"acc_norm\": 0.7684887459807074,\n\
\ \"acc_norm_stderr\": 0.023956532766639133\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904212,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904212\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.524822695035461,\n \"acc_stderr\": 0.029790719243829714,\n \
\ \"acc_norm\": 0.524822695035461,\n \"acc_norm_stderr\": 0.029790719243829714\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5182529335071708,\n\
\ \"acc_stderr\": 0.012761723960595474,\n \"acc_norm\": 0.5182529335071708,\n\
\ \"acc_norm_stderr\": 0.012761723960595474\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7279411764705882,\n \"acc_stderr\": 0.027033041151681456,\n\
\ \"acc_norm\": 0.7279411764705882,\n \"acc_norm_stderr\": 0.027033041151681456\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7124183006535948,\n \"acc_stderr\": 0.018311653053648222,\n \
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.018311653053648222\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7591836734693878,\n \"acc_stderr\": 0.02737294220178816,\n\
\ \"acc_norm\": 0.7591836734693878,\n \"acc_norm_stderr\": 0.02737294220178816\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n\
\ \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n\
\ \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n\
\ \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3843329253365973,\n\
\ \"mc1_stderr\": 0.0170287073012452,\n \"mc2\": 0.5543558949846231,\n\
\ \"mc2_stderr\": 0.016036294123592646\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7687450670876085,\n \"acc_stderr\": 0.011850040124850508\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5466262319939348,\n \
\ \"acc_stderr\": 0.013712471049515439\n }\n}\n```"
repo_url: https://huggingface.co/AA051610/A0105
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|arc:challenge|25_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|gsm8k|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hellaswag|10_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T02-57-13.678426.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-06T02-57-13.678426.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- '**/details_harness|winogrande|5_2024-01-06T02-57-13.678426.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-06T02-57-13.678426.parquet'
- config_name: results
data_files:
- split: 2024_01_06T02_57_13.678426
path:
- results_2024-01-06T02-57-13.678426.parquet
- split: latest
path:
- results_2024-01-06T02-57-13.678426.parquet
---
# Dataset Card for Evaluation run of AA051610/A0105
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AA051610/A0105](https://huggingface.co/AA051610/A0105) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AA051610__A0105",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-06T02:57:13.678426](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__A0105/blob/main/results_2024-01-06T02-57-13.678426.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each task's results in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6820912387442206,
"acc_stderr": 0.031065747816855848,
"acc_norm": 0.6864043317313094,
"acc_norm_stderr": 0.031666952183494475,
"mc1": 0.3843329253365973,
"mc1_stderr": 0.0170287073012452,
"mc2": 0.5543558949846231,
"mc2_stderr": 0.016036294123592646
},
"harness|arc:challenge|25": {
"acc": 0.6117747440273038,
"acc_stderr": 0.014241614207414044,
"acc_norm": 0.621160409556314,
"acc_norm_stderr": 0.014175915490000326
},
"harness|hellaswag|10": {
"acc": 0.6291575383389763,
"acc_stderr": 0.004820431839600027,
"acc_norm": 0.8254331806413066,
"acc_norm_stderr": 0.003788203729346702
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6814814814814815,
"acc_stderr": 0.04024778401977108,
"acc_norm": 0.6814814814814815,
"acc_norm_stderr": 0.04024778401977108
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7960526315789473,
"acc_stderr": 0.03279000406310049,
"acc_norm": 0.7960526315789473,
"acc_norm_stderr": 0.03279000406310049
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.02749566368372405,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.02749566368372405
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7916666666666666,
"acc_stderr": 0.03396116205845335,
"acc_norm": 0.7916666666666666,
"acc_norm_stderr": 0.03396116205845335
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.0365634365335316,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.0365634365335316
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6936170212765957,
"acc_stderr": 0.030135906478517563,
"acc_norm": 0.6936170212765957,
"acc_norm_stderr": 0.030135906478517563
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6551724137931034,
"acc_stderr": 0.03960933549451207,
"acc_norm": 0.6551724137931034,
"acc_norm_stderr": 0.03960933549451207
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.025591857761382182,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.025591857761382182
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8225806451612904,
"acc_stderr": 0.02173254068932928,
"acc_norm": 0.8225806451612904,
"acc_norm_stderr": 0.02173254068932928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.035107665979592174,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.035107665979592174
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.032250781083062896,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.032250781083062896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.025545650426603617,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.025545650426603617
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7461538461538462,
"acc_stderr": 0.022066054378726253,
"acc_norm": 0.7461538461538462,
"acc_norm_stderr": 0.022066054378726253
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652458,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652458
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7605042016806722,
"acc_stderr": 0.027722065493361255,
"acc_norm": 0.7605042016806722,
"acc_norm_stderr": 0.027722065493361255
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.41721854304635764,
"acc_stderr": 0.040261414976346104,
"acc_norm": 0.41721854304635764,
"acc_norm_stderr": 0.040261414976346104
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8605504587155963,
"acc_stderr": 0.014852421490033055,
"acc_norm": 0.8605504587155963,
"acc_norm_stderr": 0.014852421490033055
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5879629629629629,
"acc_stderr": 0.03356787758160831,
"acc_norm": 0.5879629629629629,
"acc_norm_stderr": 0.03356787758160831
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8823529411764706,
"acc_stderr": 0.022613286601132012,
"acc_norm": 0.8823529411764706,
"acc_norm_stderr": 0.022613286601132012
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.869198312236287,
"acc_stderr": 0.021948766059470767,
"acc_norm": 0.869198312236287,
"acc_norm_stderr": 0.021948766059470767
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7443946188340808,
"acc_stderr": 0.029275891003969923,
"acc_norm": 0.7443946188340808,
"acc_norm_stderr": 0.029275891003969923
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.035817969517092825,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.035817969517092825
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8974358974358975,
"acc_stderr": 0.01987565502786746,
"acc_norm": 0.8974358974358975,
"acc_norm_stderr": 0.01987565502786746
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8748403575989783,
"acc_stderr": 0.011832954239305742,
"acc_norm": 0.8748403575989783,
"acc_norm_stderr": 0.011832954239305742
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7947976878612717,
"acc_stderr": 0.021742519835276277,
"acc_norm": 0.7947976878612717,
"acc_norm_stderr": 0.021742519835276277
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3754189944134078,
"acc_stderr": 0.01619510424846353,
"acc_norm": 0.3754189944134078,
"acc_norm_stderr": 0.01619510424846353
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7875816993464052,
"acc_stderr": 0.02342037547829613,
"acc_norm": 0.7875816993464052,
"acc_norm_stderr": 0.02342037547829613
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7684887459807074,
"acc_stderr": 0.023956532766639133,
"acc_norm": 0.7684887459807074,
"acc_norm_stderr": 0.023956532766639133
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.02399350170904212,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.02399350170904212
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.524822695035461,
"acc_stderr": 0.029790719243829714,
"acc_norm": 0.524822695035461,
"acc_norm_stderr": 0.029790719243829714
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5182529335071708,
"acc_stderr": 0.012761723960595474,
"acc_norm": 0.5182529335071708,
"acc_norm_stderr": 0.012761723960595474
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7279411764705882,
"acc_stderr": 0.027033041151681456,
"acc_norm": 0.7279411764705882,
"acc_norm_stderr": 0.027033041151681456
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.018311653053648222,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.018311653053648222
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7591836734693878,
"acc_stderr": 0.02737294220178816,
"acc_norm": 0.7591836734693878,
"acc_norm_stderr": 0.02737294220178816
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166327,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.026168221344662297,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.026168221344662297
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3843329253365973,
"mc1_stderr": 0.0170287073012452,
"mc2": 0.5543558949846231,
"mc2_stderr": 0.016036294123592646
},
"harness|winogrande|5": {
"acc": 0.7687450670876085,
"acc_stderr": 0.011850040124850508
},
"harness|gsm8k|5": {
"acc": 0.5466262319939348,
"acc_stderr": 0.013712471049515439
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Lo/adapt-pre-trained-VL-models-to-text-data-Wikipedia-finetune | ---
language:
- en
license:
- cc-by-sa-3.0
multilinguality:
- monolingual
---
The Wikipedia finetune data used to train visual features for the adaptation of vision-and-language models to text-only tasks in the paper "How to Adapt Pre-trained Vision-and-Language Models to a Text-only Input?".
The data has been created from the "20200501.en" revision of the [wikipedia dataset](https://huggingface.co/datasets/wikipedia) on Huggingface.
|
el2e10/aya-paraphrase-marathi | ---
language:
- mr
license: cc
size_categories:
- n<1K
source_datasets:
- extended|ai4bharat/IndicXParaphrase
task_categories:
- text-generation
pretty_name: Aya Paraphrase Marathi
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: template_lang
dtype: string
- name: template_id
dtype: int64
splits:
- name: train
num_bytes: 683937
num_examples: 1001
download_size: 245473
dataset_size: 683937
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
### Description
This dataset is derived from an existing dataset made by AI4Bharat. We used the [IndicXParaphrase](https://huggingface.co/datasets/ai4bharat/IndicXParaphrase) dataset from AI4Bharat to create this instruction-style dataset.
We used the Marathi split of the above-mentioned dataset to create this one. This was created as part of the [Aya Open Science Initiative](https://sites.google.com/cohere.com/aya-en/home) from Cohere For AI.
IndicXParaphrase is a multilingual, n-way parallel dataset for paraphrase detection in 10 Indic languages. The original dataset (IndicXParaphrase) was made available under the CC-0 license.
### Template
The following templates (Marathi) were used to convert the original dataset:
```
#Template 1
prompt:
खालील वाक्य दुसरे-भिन्न शब्द वापरून लिहा: "{original_sentence}"
completion:
{paraphrased_sentence}
```
```
#Template 2
prompt:
खालील वाक्य वेगळ्या प्रकारे पुन्हा लिहा: "{original_sentence}"
completion:
{paraphrased_sentence}
```
```
#Template 3
prompt:
खालील वाक्य दुसरे शब्द वापरून रूपांतरित-अनुवादित करा: "{original_sentence}"
completion:
{paraphrased_sentence}
```
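As an illustration (not part of the original release), the conversion of a paraphrase pair into the instruction-style format above can be sketched in plain Python. The helper function below is hypothetical; only the `inputs`/`targets`/`template_lang`/`template_id` field names come from the dataset schema in the metadata, and the template strings mirror the three Marathi prompts listed above:

```python
# Hypothetical sketch of converting a (sentence, paraphrase) pair into an
# instruction-style row matching this dataset's schema. Only the field names
# are taken from the card; the conversion code itself is illustrative.

TEMPLATES = [
    'खालील वाक्य दुसरे-भिन्न शब्द वापरून लिहा: "{original_sentence}"',
    'खालील वाक्य वेगळ्या प्रकारे पुन्हा लिहा: "{original_sentence}"',
    'खालील वाक्य दुसरे शब्द वापरून रूपांतरित-अनुवादित करा: "{original_sentence}"',
]

def to_instruction(original_sentence, paraphrased_sentence, template_id=0):
    """Fill one of the Marathi prompt templates and return one dataset row."""
    prompt = TEMPLATES[template_id].format(original_sentence=original_sentence)
    return {
        "inputs": prompt,           # the filled prompt
        "targets": paraphrased_sentence,  # the expected completion
        "template_lang": "mr",
        "template_id": template_id,
    }

row = to_instruction("मूळ वाक्य", "पुनर्लेखित वाक्य", template_id=1)
print(row["inputs"])
print(row["targets"])
```

Applying each template to every source pair in this way would yield roughly three instruction examples per paraphrase pair.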
### Acknowledgement
Thank you, Yogesh Haribhau Kulkarni for helping with the preparation of this dataset by providing the Marathi translation of the above mentioned English prompts. |
daven3/geobench | ---
license: apache-2.0
task_categories:
- multiple-choice
- question-answering
size_categories:
- 1K<n<10K
---
# Benchmark: GeoBenchmark
In GeoBenchmark, we collect 183 multiple-choice questions from NPEE and 1,395 from the AP Test for the objective tasks.
Meanwhile, we gather all 939 subjective questions from NPEE as the subjective task set and use 50 of them to measure the baselines with human evaluation.
|
RTVS/SpotifyLyrics001 | ---
license: cc0-1.0
task_categories:
- text-generation
language:
- en
tags:
- art
pretty_name: Spotify Lyrics From Kaggle dataset
--- |
NPCProgrammer/ALBERT_tweet_tuned | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': non_irony
'1': irony
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 9085595
num_examples: 2862
- name: test
num_bytes: 2493753
num_examples: 784
- name: validation
num_bytes: 3031237
num_examples: 955
download_size: 580596
dataset_size: 14610585
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
hvvvque2/minhavozretreinada | ---
license: openrail
---
|
Brizape/SETH_ibo | ---
dataset_info:
features:
- name: id
dtype: string
- name: token
sequence: string
- name: ner_tags
sequence: int64
splits:
- name: train
num_bytes: 1585605.8988095238
num_examples: 403
- name: validation
num_bytes: 397385.1011904762
num_examples: 101
- name: test
num_bytes: 473869
num_examples: 126
download_size: 405462
dataset_size: 2456860.0
---
# Dataset Card for "SETH_ibo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/VQAv2_minival_no_image_google_flan_t5_xl_mode_D_PNP_FILTER_C_Q_rices_ns_25994 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_clip_ViT_L_14_blip_caption_caption_module_random_
num_bytes: 3684225
num_examples: 25994
download_size: 1310320
dataset_size: 3684225
---
# Dataset Card for "VQAv2_minival_no_image_google_flan_t5_xl_mode_D_PNP_FILTER_C_Q_rices_ns_25994"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yavasde/lemmatized-wikitext2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2652445
num_examples: 23767
- name: test
num_bytes: 313242
num_examples: 2891
- name: valid
num_bytes: 284363
num_examples: 2461
download_size: 1949711
dataset_size: 3250050
---
# Dataset Card for "lemmatized-wikitext2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jilp00/youtoks-transcripts-Intro-Psychology | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1360676
num_examples: 1583
download_size: 757845
dataset_size: 1360676
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
JovialValley/broadclass_totalMapped2 | ---
dataset_info:
features:
- name: input_values
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 109265512
num_examples: 390
- name: test
num_bytes: 27156588
num_examples: 97
download_size: 137259978
dataset_size: 136422100
---
# Dataset Card for "broadclass_totalMapped2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sadiakanwal/openassistant_guanco_customized | ---
language:
- en
tags:
- question
--- |
pritamdeka/dataset_dnrti_train | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3795169
num_examples: 5250
download_size: 1090344
dataset_size: 3795169
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
distilled-one-sec-cv12-each-chunk-uniq/chunk_6 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1143722652.0
num_examples: 222861
download_size: 1166386713
dataset_size: 1143722652.0
---
# Dataset Card for "chunk_6"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jax-diffusers-event/canny_diffusiondb | ---
dataset_info:
features:
- name: original_image
dtype: image
- name: prompt
dtype: string
- name: transformed_image
dtype: image
splits:
- name: train
num_bytes: 604990210.0
num_examples: 994
download_size: 604849707
dataset_size: 604990210.0
---
# Canny DiffusionDB
This dataset is the [DiffusionDB dataset](https://huggingface.co/datasets/poloclub/diffusiondb) transformed with the Canny edge-detection transform.
You can see samples below 👇
**Sample:**
Original Image:

Transformed Image:

Caption:
"a small wheat field beside a forest, studio lighting, golden ratio, details, masterpiece, fine art, intricate, decadent, ornate, highly detailed, digital painting, octane render, ray tracing reflections, 8 k, featured, by claude monet and vincent van gogh "
Below you can find a small script used to create this dataset:
```python
import cv2
import numpy as np
from datasets import load_dataset
from PIL import Image

def canny_convert(image):
    # PIL images are RGB, so convert accordingly before edge detection
    image_array = np.array(image)
    gray_image = cv2.cvtColor(image_array, cv2.COLOR_RGB2GRAY)
    edges = cv2.Canny(gray_image, 100, 200)
    return Image.fromarray(edges)

dataset = load_dataset("poloclub/diffusiondb", split="train")

dataset_list = []
for data in dataset:
    image = data["image"]
    prompt = data["prompt"]
    transformed_image = canny_convert(image)
    dataset_list.append({
        "original_image": image,
        "prompt": prompt,
        "transformed_image": transformed_image,
    })
``` |
Andyrasika/awesome_prompts | ---
dataset_info:
features:
- name: act
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 74581
num_examples: 153
download_size: 45077
dataset_size: 74581
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Tverous/anli-amr | ---
dataset_info:
features:
- name: uid
dtype: string
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: reason
dtype: string
- name: claim_cleaned_amr
dtype: string
- name: amr_penman
dtype: string
- name: amr_tokens
sequence: string
- name: amr_nodes
dtype: string
- name: amr_alignments
dtype: string
- name: amr_edges
sequence:
sequence: string
splits:
- name: train
num_bytes: 146374351
num_examples: 100459
- name: dev
num_bytes: 1919899
num_examples: 1200
- name: test
num_bytes: 1907283
num_examples: 1200
download_size: 44471917
dataset_size: 150201533
---
# Dataset Card for "anli-amr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
subset-data/finetune-data-792a4a928579 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 439213.3333333333
num_examples: 56
- name: test
num_bytes: 31372.380952380954
num_examples: 4
- name: valid
num_bytes: 23529.285714285714
num_examples: 3
download_size: 147649
dataset_size: 494115.0
---
# Dataset Card for "finetune-data-792a4a928579"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ktrinh38/aritzia | ---
dataset_info:
features:
- name: folder
dtype: string
- name: path
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 737220045.968
num_examples: 8182
download_size: 707777648
dataset_size: 737220045.968
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zambezivoice/zambezivoice_loz_aug_text | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 582481
num_examples: 10397
download_size: 345760
dataset_size: 582481
---
# Dataset Card for "zambezivoice_loz_aug_text"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SUSTech/bagel-dpo | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: source
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 655389313
num_examples: 192036
download_size: 369012655
dataset_size: 655389313
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jacksprat/TCGA_N_staging | ---
license: unknown
---
|
PKU-Alignment/PKU-SafeRLHF | ---
license: cc-by-nc-4.0
task_categories:
- text-generation
language:
- en
tags:
- safe
- safety
- ai-safety
- llm
- lm
- human-feedback
- rlhf
- safe-rlhf
size_categories:
- 100K<n<1M
---
# Dataset Card for PKU-SafeRLHF
<span style="color: red;">Warning: this dataset contains data that may be offensive or harmful. The data are intended for research purposes, especially research that can make models less harmful. The views expressed in the data do not reflect the views of PKU-Alignment Team or any of its members. </span>
## Dataset Summary
The preference dataset consists of 30k+ expert comparison entries. Each entry in this dataset includes two responses to a question, along with safety meta-labels and preferences for both responses, taking into consideration their helpfulness and harmlessness.
The annotation pipeline for this dataset is depicted in the following image:

## Human-Preference on Harmlessness and Helpfulness
### Harmlessness of a QA Pair
The harmlessness of a QA pair is evaluated based on ethical considerations and safety implications inherent in the interaction.
A QA pair is deemed harmless when it is classified as risk-neutral across all 14 harm categories outlined in our paper.
By risk-neutral, we mean that the QA pair does not engender or promote any harmful consequences or risks as per the definitions in these categories.
Thus, a risk-neutral QA pair neither incites harm nor leads to unsafe outcomes, effectively aligning with our safety and ethical guidelines.
### Helpfulness of a Response
The helpfulness of a response pertains to how effectively it addresses a given prompt. This measure is independent of the harmlessness of the response, as it focuses solely on the quality, clarity, and relevance of the provided information. Consequently, the helpfulness judgment can be distinctly different from the harmlessness judgment. For instance, consider a situation where a user asks about the procedure to synthesize methamphetamine. In such a case, a detailed, step-by-step response would be considered helpful due to its accuracy and thoroughness. However, due to the harmful implications of manufacturing illicit substances, this QA pair would be classified as extremely harmful.
### Ranking of Responses
Once the helpfulness and harmlessness of responses are evaluated, they are ranked accordingly. It is important to note that this is a two-dimensional ranking: responses are ranked separately for helpfulness and harmlessness. This is due to the distinctive and independent nature of these two attributes. The resulting rankings provide a nuanced perspective on the responses, allowing us to balance information quality with safety and ethical considerations. These separate rankings of helpfulness and harmlessness contribute to a more comprehensive understanding of LLM outputs, particularly in the context of safety alignment. We have enforced a logical order to ensure the correctness of the harmlessness ranking: harmless responses (i.e. all 14 harm categories risk-neutral) are always ranked higher than harmful ones (i.e., at least 1 category risky).
## Usage
To load our dataset, use the `load_dataset()` function as follows:
```python
from datasets import load_dataset
dataset = load_dataset("PKU-Alignment/PKU-SafeRLHF")
```
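Because the rankings are two-dimensional, a common next step is to extract the safer of the two responses from each comparison pair. Below is a minimal sketch of that selection logic; the field names (`response_0`, `response_1`, `safer_response_id`) are assumptions about the schema, so verify them against `dataset.features` after loading.

```python
# Sketch: pick the safer response from a comparison pair.
# Field names (`response_0`, `response_1`, `safer_response_id`) are
# assumptions about the schema -- check dataset.features after loading.
def pick_safer(example):
    idx = example["safer_response_id"]  # 0 or 1
    return example[f"response_{idx}"]

# Illustrative row, not actual dataset content:
row = {
    "response_0": "Here is a detailed but unsafe answer...",
    "response_1": "I can't help with that, but here is a safe alternative.",
    "safer_response_id": 1,
}
print(pick_safer(row))
```

The same function can be passed to `Dataset.map` to materialize a "safer response" column over the full training split.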
## Paper
You can find more information in our paper
- **Dataset Paper:** <https://arxiv.org/abs/2307.04657>
## Contact
The original authors host this dataset on GitHub here: https://github.com/PKU-Alignment/beavertails.
|
BangumiBase/summertimerender | ---
license: mit
tags:
- art
size_categories:
- 1K<n<10K
---
# Bangumi Image Base of Summertime Render
This is the image base of the bangumi Summertime Render. We detected 32 characters and 2,981 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be fully cleaned and may contain noisy samples.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 372 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 55 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 33 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 230 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 48 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 732 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 66 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 88 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 68 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 73 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 288 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 20 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 14 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 64 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 164 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 50 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 19 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 19 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 21 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 30 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 46 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 13 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 8 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 9 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 13 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 43 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 99 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 7 | [Download](27/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 28 | 13 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 45 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 7 | [Download](30/dataset.zip) |  |  |  |  |  |  |  | N/A |
| noise | 224 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
CAiRE/prosocial-dialog-zho_Hans | ---
dataset_info:
features:
- name: context
dtype: string
- name: response
dtype: string
- name: rots
sequence: string
- name: safety_label
dtype: string
- name: safety_annotations
sequence: string
- name: safety_annotation_reasons
sequence: string
- name: source
dtype: string
- name: etc
dtype: string
- name: dialogue_id
dtype: int64
- name: response_id
dtype: int64
- name: episode_done
dtype: bool
- name: mt_context
dtype: string
splits:
- name: train
num_bytes: 75401741
num_examples: 120236
- name: validation
num_bytes: 12805152
num_examples: 20416
- name: test
num_bytes: 15658595
num_examples: 25029
download_size: 48108599
dataset_size: 103865488
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
gsstein/results-opt | ---
dataset_info:
features:
- name: id
dtype: string
- name: base_100
dtype: string
- name: opt_100
dtype: string
- name: generated_opt_100
dtype: bool
- name: opt_75
dtype: string
- name: generated_opt_75
dtype: bool
- name: opt_50
dtype: string
- name: generated_opt_50
dtype: bool
- name: opt_25
dtype: string
- name: generated_opt_25
dtype: bool
- name: opt_0
dtype: string
- name: generated_opt_0
dtype: bool
splits:
- name: train
num_bytes: 10028243
num_examples: 15326
- name: test
num_bytes: 376776
num_examples: 576
- name: validation
num_bytes: 373374
num_examples: 576
download_size: 7103433
dataset_size: 10778393
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
DanielDimas/stallone | ---
license: openrail
---
|
mazkooleg/digit_mask_ft_ensemble_distilled_mfcc | ---
dataset_info:
features:
- name: label
dtype: bool
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: test
num_bytes: 30990673
num_examples: 6086
- name: validation
num_bytes: 26866052
num_examples: 5276
- name: train
num_bytes: 9297201825
num_examples: 1825800
download_size: 9560584187
dataset_size: 9355058550
---
# Dataset Card for "digit_mask_ft_ensemble_distilled_mfcc"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jonaschris2103/CBLANE | ---
license: mit
---
|
Nexdata/2608_Videos_Before_And_After_Weight_Loss_Comparison_Data | ---
license: cc-by-nc-nd-4.0
---
## Description
The 2,608 Videos – Before And After Weight Loss Comparison Data includes both indoor and outdoor scenes, covering multiple scenes and resolutions. The data can be used for tasks such as human behavior detection and before-and-after weight-loss comparison.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1366?source=Huggingface
## Data size
2,608 videos
## Collecting environment
including indoor and outdoor scenes
## Data diversity
multiple scenes, multiple shooting angles, multiple resolutions
## Collecting time
day, night
## Collecting equipment
cell phone
## Data format
the video data format is .mp4
# Licensing Information
Commercial License
|
Nadav/pixel_squad_cannon | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
array2_d:
shape:
- 23
- 23
dtype: uint8
splits:
- name: train
num_bytes: 7614068486.344
num_examples: 222844
- name: test
num_bytes: 410519961.528
num_examples: 11873
download_size: 7881628043
dataset_size: 8024588447.872
---
# Dataset Card for "pixel_squad_cannon"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-project-07c07057-797e-4d34-8fcb-023957860774-7467 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- glue
eval_info:
task: natural_language_inference
model: autoevaluate/natural-language-inference
metrics: []
dataset_name: glue
dataset_config: mrpc
dataset_split: validation
col_mapping:
text1: sentence1
text2: sentence2
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Natural Language Inference
* Model: autoevaluate/natural-language-inference
* Dataset: glue
* Config: mrpc
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
MingLiiii/Alpaca_Analysis_llama2_13b | ---
dataset_info:
features:
- name: data
struct:
- name: loss
sequence: float64
- name: ppl
sequence: float64
splits:
- name: origin
num_bytes: 3755354
num_examples: 52002
- name: reflect_instruction
num_bytes: 3757082
num_examples: 52002
- name: reflect_response
num_bytes: 3744144
num_examples: 52002
- name: reflect_both
num_bytes: 3744144
num_examples: 52002
download_size: 12546147
dataset_size: 15000724
---
# Dataset Card for "Alpaca_Analysis_llama2_13b"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-one-sec-cv12-each-chunk-uniq/chunk_73 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1319093356.0
num_examples: 257033
download_size: 1350528354
dataset_size: 1319093356.0
---
# Dataset Card for "chunk_73"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
GoodCookie/Timeless-Bunni | ---
license: afl-3.0
---
|
NickyNicky/med-qa-en-4options-source_filter | ---
dataset_info:
features:
- name: description
dtype: string
- name: question
dtype: string
- name: options
list:
- name: key
dtype: string
- name: value
dtype: string
- name: answer
struct:
- name: key
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 9331312
num_examples: 10178
download_size: 5059128
dataset_size: 9331312
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
PaddlePaddle/duconv | ---
license: apache-2.0
---
|
thercyl/V | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: float64
- name: Ticker
dtype: string
- name: Year
dtype: string
- name: Text
dtype: string
- name: Embedding
dtype: string
splits:
- name: train
num_bytes: 56161821
num_examples: 1614
download_size: 34407640
dataset_size: 56161821
---
# Dataset Card for "V"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kossnocorp/wikipedia-words-ru-low | ---
dataset_info:
features:
- name: word
dtype: string
- name: pos
dtype: string
- name: count
dtype: int64
- name: frequency
dtype: float64
splits:
- name: train
num_bytes: 148701467.08883896
num_examples: 3386210
download_size: 52455403
dataset_size: 148701467.08883896
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
photonmz/roco-instruct-65k | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: image
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 18403899
num_examples: 65422
- name: validation
num_bytes: 2289458
num_examples: 8174
- name: test
num_bytes: 2313629
num_examples: 8176
download_size: 8200395
dataset_size: 23006986
---
# Dataset Card for "roco-instruct-65k"
## Dataset Description
- **Repository:** [ROCO GitHub Repository](https://github.com/razorx89/roco-dataset)
- **Paper:** [Radiology Objects in COntext (ROCO) dataset](https://labels.tue-image.nl/wp-content/uploads/2018/09/AM-04.pdf)
- **Point of Contact:** ROCO's original authors
### Dataset Summary
The "roco-instruct-65k" dataset is derived from the Radiology Objects in COntext (ROCO) dataset, a large-scale medical and multimodal imaging collection. The images are taken from publications available on the PubMed Central Open Access FTP mirror. The dataset was reformatted for the [LLaVA model](https://llava-vl.github.io/) in the [BabyDoctor project](https://github.com/photomz/BabyDoctor), focusing on deep analysis and diagnosis of radiology images. It includes captions, keywords, UMLS Semantic Types (SemTypes), and UMLS Concept Unique Identifiers (CUIs), and supports the creation of generative models for image captioning, classification models for image categorization, and tagging or content-based image retrieval systems. The language used is primarily English, and it covers the domain of medical imaging, specifically radiology.
### Supported Tasks and Leaderboards
- `image-classification`: The dataset can be used to train models for image classification, which involves categorizing images as either radiology or non-radiology. Success on this task is typically measured by achieving a high accuracy. This task has an active leaderboard which can be found at [ImageCLEFmed Caption 2019 and CrowdAI](https://www.imageclef.org/2019/medical/caption).
### Languages
The dataset consists entirely of medical texts in English.
## Dataset Structure
### Data Instances
The dataset is structured in a conversation format where a human provides an image with instructions for analysis, and a model responds with a diagnosis. A typical instance in the dataset looks like:
```json
{
  "conversations": [
    { "from": "human", "value": "The following image is a radiology scan. Deeply analyze and diagnose this image.\n<image>" },
    { "from": "gpt", "value": "Computed tomography scan in axial view showing obliteration of the left maxillary sinus" }
  ],
  "image": "ROCO_00002.jpg",
  "id": "00002"
}
```
### Data Fields
- `conversations`: A list containing the interaction between a human and a model regarding the image.
- `image`: A string containing the name of the image file.
- `id`: A string representing the unique identifier for the interaction.
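The conversation structure above can be unpacked with a few lines of plain Python. This sketch separates the human instruction from the model diagnosis in one record, using the instance format shown earlier:

```python
# Sketch: split one ROCO-instruct record into its human instruction
# and model diagnosis, based on the instance format shown above.
record = {
    "conversations": [
        {"from": "human",
         "value": "The following image is a radiology scan. Deeply analyze and diagnose this image.\n<image>"},
        {"from": "gpt",
         "value": "Computed tomography scan in axial view showing obliteration of the left maxillary sinus"},
    ],
    "image": "ROCO_00002.jpg",
    "id": "00002",
}

def split_turns(rec):
    # First human turn is the prompt; first gpt turn is the diagnosis.
    prompt = next(t["value"] for t in rec["conversations"] if t["from"] == "human")
    diagnosis = next(t["value"] for t in rec["conversations"] if t["from"] == "gpt")
    return prompt, diagnosis

prompt, diagnosis = split_turns(record)
```

The `<image>` placeholder in the prompt marks where the LLaVA-style pipeline injects the image features.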
### Data Splits
The dataset is divided into training, validation, and test sets. The exact split sizes are:
| | train | validation | test |
|-----------------|-------:|-----------:|------:|
| Data Instances  | 65422 | 8174 | 8176 |
## Dataset Creation
### Curation Rationale
The "roco-instruct-65k" dataset was created to foster the development of AI models capable of performing deep analysis and diagnosis on radiology images, an essential step in automating medical imaging interpretation.
### Citation Information
[@photomz](https://github.com/photomz) uploaded this dataset to HuggingFace. Please cite the original ROCO paper when using this dataset.
```
O. Pelka, S. Koitka, J. Rückert, F. Nensa, C.M. Friedrich,
"Radiology Objects in COntext (ROCO): A Multimodal Image Dataset".
MICCAI Workshop on Large-scale Annotation of Biomedical Data and Expert Label Synthesis (LABELS) 2018, September 16, 2018, Granada, Spain. Lecture Notes on Computer Science (LNCS), vol. 11043, pp. 180-189, Springer Cham, 2018.
doi: 10.1007/978-3-030-01364-6_20
``` |
Terdem/Cem_Adrian | ---
license: openrail
---
|
NASP/neteval-exam | ---
license: cc-by-nc-sa-4.0
task_categories:
- text-classification
- question-answering
- multiple-choice
language:
- en
- zh
pretty_name: Netops
size_categories:
- 10K<n<100K
---
NetEval is a NetOps evaluation suite for foundation models, consisting of 5,269 multiple-choice questions. Please check [our paper](https://arxiv.org/abs/2309.05557) for more details about NetEval.
We hope NetEval could help developers track the progress and analyze the NetOps ability of their models.
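Tracking progress on a multiple-choice suite like this boils down to comparing predicted options against gold answers. A minimal scoring sketch follows; the letter-option format here is illustrative, not the dataset's actual schema.

```python
# Sketch: score multiple-choice predictions against gold answers, as one
# might when tracking a model's NetEval accuracy. The letter options are
# illustrative, not the dataset's actual schema.
def accuracy(predictions, golds):
    correct = sum(p == g for p, g in zip(predictions, golds))
    return correct / len(golds)

print(accuracy(["A", "C", "B", "D"], ["A", "B", "B", "D"]))  # prints 0.75
```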
## Citation
Please cite our paper if you use our dataset.
```
@misc{miao2023empirical,
title={An Empirical Study of NetOps Capability of Pre-Trained Large Language Models},
author={Yukai Miao and Yu Bai and Li Chen and Dan Li and Haifeng Sun and Xizheng Wang and Ziqiu Luo and Dapeng Sun and Xiuting Xu and Qi Zhang and Chao Xiang and Xinchi Li},
year={2023},
eprint={2309.05557},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
bertbsb/HerbeetVanderley | ---
license: openrail
---
|
open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseek-67b-v18.1-4k | ---
pretty_name: Evaluation run of OpenBuddy/openbuddy-deepseek-67b-v18.1-4k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OpenBuddy/openbuddy-deepseek-67b-v18.1-4k](https://huggingface.co/OpenBuddy/openbuddy-deepseek-67b-v18.1-4k)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseek-67b-v18.1-4k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-18T20:55:27.550442](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseek-67b-v18.1-4k/blob/main/results_2024-02-18T20-55-27.550442.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7058260280059325,\n\
\ \"acc_stderr\": 0.030134629260569593,\n \"acc_norm\": 0.7076734462849897,\n\
\ \"acc_norm_stderr\": 0.03073727698082304,\n \"mc1\": 0.39412484700122397,\n\
\ \"mc1_stderr\": 0.01710658814070033,\n \"mc2\": 0.5565901681593471,\n\
\ \"mc2_stderr\": 0.015389712051681206\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6493174061433447,\n \"acc_stderr\": 0.013944635930726096,\n\
\ \"acc_norm\": 0.6774744027303754,\n \"acc_norm_stderr\": 0.013659980894277371\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.655646285600478,\n\
\ \"acc_stderr\": 0.0047418597531784295,\n \"acc_norm\": 0.8465445130452102,\n\
\ \"acc_norm_stderr\": 0.0035968938961909126\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.03944624162501116,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.03944624162501116\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7631578947368421,\n \"acc_stderr\": 0.03459777606810537,\n\
\ \"acc_norm\": 0.7631578947368421,\n \"acc_norm_stderr\": 0.03459777606810537\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.82,\n\
\ \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\": 0.82,\n \
\ \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7622641509433963,\n \"acc_stderr\": 0.026199808807561915,\n\
\ \"acc_norm\": 0.7622641509433963,\n \"acc_norm_stderr\": 0.026199808807561915\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8263888888888888,\n\
\ \"acc_stderr\": 0.03167473383795718,\n \"acc_norm\": 0.8263888888888888,\n\
\ \"acc_norm_stderr\": 0.03167473383795718\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.58,\n \"acc_stderr\": 0.04960449637488584,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7276595744680852,\n \"acc_stderr\": 0.029101290698386715,\n\
\ \"acc_norm\": 0.7276595744680852,\n \"acc_norm_stderr\": 0.029101290698386715\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7034482758620689,\n \"acc_stderr\": 0.03806142687309992,\n\
\ \"acc_norm\": 0.7034482758620689,\n \"acc_norm_stderr\": 0.03806142687309992\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5238095238095238,\n \"acc_stderr\": 0.025722097064388525,\n \"\
acc_norm\": 0.5238095238095238,\n \"acc_norm_stderr\": 0.025722097064388525\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8225806451612904,\n \"acc_stderr\": 0.021732540689329276,\n \"\
acc_norm\": 0.8225806451612904,\n \"acc_norm_stderr\": 0.021732540689329276\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5615763546798029,\n \"acc_stderr\": 0.03491207857486519,\n \"\
acc_norm\": 0.5615763546798029,\n \"acc_norm_stderr\": 0.03491207857486519\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\"\
: 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.030874145136562097,\n\
\ \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.030874145136562097\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8939393939393939,\n \"acc_stderr\": 0.021938047738853106,\n \"\
acc_norm\": 0.8939393939393939,\n \"acc_norm_stderr\": 0.021938047738853106\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9637305699481865,\n \"acc_stderr\": 0.013492659751295153,\n\
\ \"acc_norm\": 0.9637305699481865,\n \"acc_norm_stderr\": 0.013492659751295153\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7102564102564103,\n \"acc_stderr\": 0.02300062824368797,\n \
\ \"acc_norm\": 0.7102564102564103,\n \"acc_norm_stderr\": 0.02300062824368797\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3814814814814815,\n \"acc_stderr\": 0.029616718927497593,\n \
\ \"acc_norm\": 0.3814814814814815,\n \"acc_norm_stderr\": 0.029616718927497593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8151260504201681,\n \"acc_stderr\": 0.025215992877954202,\n\
\ \"acc_norm\": 0.8151260504201681,\n \"acc_norm_stderr\": 0.025215992877954202\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4370860927152318,\n \"acc_stderr\": 0.04050035722230636,\n \"\
acc_norm\": 0.4370860927152318,\n \"acc_norm_stderr\": 0.04050035722230636\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8917431192660551,\n \"acc_stderr\": 0.013321348447611759,\n \"\
acc_norm\": 0.8917431192660551,\n \"acc_norm_stderr\": 0.013321348447611759\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6157407407407407,\n \"acc_stderr\": 0.03317354514310742,\n \"\
acc_norm\": 0.6157407407407407,\n \"acc_norm_stderr\": 0.03317354514310742\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9117647058823529,\n \"acc_stderr\": 0.019907399791316942,\n \"\
acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.019907399791316942\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8776371308016878,\n \"acc_stderr\": 0.021331741829746793,\n \
\ \"acc_norm\": 0.8776371308016878,\n \"acc_norm_stderr\": 0.021331741829746793\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7847533632286996,\n\
\ \"acc_stderr\": 0.027584066602208274,\n \"acc_norm\": 0.7847533632286996,\n\
\ \"acc_norm_stderr\": 0.027584066602208274\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159463,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159463\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.859504132231405,\n \"acc_stderr\": 0.03172233426002158,\n \"acc_norm\"\
: 0.859504132231405,\n \"acc_norm_stderr\": 0.03172233426002158\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.03680918141673881,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.03680918141673881\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971726,\n\
\ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971726\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5446428571428571,\n\
\ \"acc_stderr\": 0.04726835553719097,\n \"acc_norm\": 0.5446428571428571,\n\
\ \"acc_norm_stderr\": 0.04726835553719097\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.034926064766237906,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.034926064766237906\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9358974358974359,\n\
\ \"acc_stderr\": 0.016046261631673144,\n \"acc_norm\": 0.9358974358974359,\n\
\ \"acc_norm_stderr\": 0.016046261631673144\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8914431673052363,\n\
\ \"acc_stderr\": 0.011124283175851183,\n \"acc_norm\": 0.8914431673052363,\n\
\ \"acc_norm_stderr\": 0.011124283175851183\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7630057803468208,\n \"acc_stderr\": 0.02289408248992599,\n\
\ \"acc_norm\": 0.7630057803468208,\n \"acc_norm_stderr\": 0.02289408248992599\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4223463687150838,\n\
\ \"acc_stderr\": 0.016519594275297117,\n \"acc_norm\": 0.4223463687150838,\n\
\ \"acc_norm_stderr\": 0.016519594275297117\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.022733789405447586,\n\
\ \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.022733789405447586\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8135048231511254,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.8135048231511254,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.020736358408060006,\n\
\ \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.020736358408060006\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5283687943262412,\n \"acc_stderr\": 0.029779450957303055,\n \
\ \"acc_norm\": 0.5283687943262412,\n \"acc_norm_stderr\": 0.029779450957303055\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5495436766623207,\n\
\ \"acc_stderr\": 0.012707390438502346,\n \"acc_norm\": 0.5495436766623207,\n\
\ \"acc_norm_stderr\": 0.012707390438502346\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.026679252270103128,\n\
\ \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.026679252270103128\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7777777777777778,\n \"acc_stderr\": 0.01681902837573638,\n \
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.01681902837573638\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7591836734693878,\n \"acc_stderr\": 0.02737294220178816,\n\
\ \"acc_norm\": 0.7591836734693878,\n \"acc_norm_stderr\": 0.02737294220178816\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\
\ \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n\
\ \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \
\ \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n\
\ \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.39412484700122397,\n\
\ \"mc1_stderr\": 0.01710658814070033,\n \"mc2\": 0.5565901681593471,\n\
\ \"mc2_stderr\": 0.015389712051681206\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.829518547750592,\n \"acc_stderr\": 0.010569021122825905\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6921910538286581,\n \
\ \"acc_stderr\": 0.012714401009923645\n }\n}\n```"
repo_url: https://huggingface.co/mlabonne/FrankenMonarch-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|arc:challenge|25_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|gsm8k|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hellaswag|10_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T20-55-27.550442.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-18T20-55-27.550442.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- '**/details_harness|winogrande|5_2024-02-18T20-55-27.550442.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-18T20-55-27.550442.parquet'
- config_name: results
data_files:
- split: 2024_02_18T20_55_27.550442
path:
- results_2024-02-18T20-55-27.550442.parquet
- split: latest
path:
- results_2024-02-18T20-55-27.550442.parquet
---
# Dataset Card for Evaluation run of OpenBuddy/openbuddy-deepseek-67b-v18.1-4k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [OpenBuddy/openbuddy-deepseek-67b-v18.1-4k](https://huggingface.co/OpenBuddy/openbuddy-deepseek-67b-v18.1-4k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseek-67b-v18.1-4k",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-18T20:55:27.550442](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenBuddy__openbuddy-deepseek-67b-v18.1-4k/blob/main/results_2024-02-18T20-55-27.550442.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the "results" and "latest" splits for each eval):
```python
{
"all": {
"acc": 0.7058260280059325,
"acc_stderr": 0.030134629260569593,
"acc_norm": 0.7076734462849897,
"acc_norm_stderr": 0.03073727698082304,
"mc1": 0.39412484700122397,
"mc1_stderr": 0.01710658814070033,
"mc2": 0.5565901681593471,
"mc2_stderr": 0.015389712051681206
},
"harness|arc:challenge|25": {
"acc": 0.6493174061433447,
"acc_stderr": 0.013944635930726096,
"acc_norm": 0.6774744027303754,
"acc_norm_stderr": 0.013659980894277371
},
"harness|hellaswag|10": {
"acc": 0.655646285600478,
"acc_stderr": 0.0047418597531784295,
"acc_norm": 0.8465445130452102,
"acc_norm_stderr": 0.0035968938961909126
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.03944624162501116,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.03944624162501116
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7631578947368421,
"acc_stderr": 0.03459777606810537,
"acc_norm": 0.7631578947368421,
"acc_norm_stderr": 0.03459777606810537
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7622641509433963,
"acc_stderr": 0.026199808807561915,
"acc_norm": 0.7622641509433963,
"acc_norm_stderr": 0.026199808807561915
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8263888888888888,
"acc_stderr": 0.03167473383795718,
"acc_norm": 0.8263888888888888,
"acc_norm_stderr": 0.03167473383795718
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7276595744680852,
"acc_stderr": 0.029101290698386715,
"acc_norm": 0.7276595744680852,
"acc_norm_stderr": 0.029101290698386715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7034482758620689,
"acc_stderr": 0.03806142687309992,
"acc_norm": 0.7034482758620689,
"acc_norm_stderr": 0.03806142687309992
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5238095238095238,
"acc_stderr": 0.025722097064388525,
"acc_norm": 0.5238095238095238,
"acc_norm_stderr": 0.025722097064388525
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8225806451612904,
"acc_stderr": 0.021732540689329276,
"acc_norm": 0.8225806451612904,
"acc_norm_stderr": 0.021732540689329276
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5615763546798029,
"acc_stderr": 0.03491207857486519,
"acc_norm": 0.5615763546798029,
"acc_norm_stderr": 0.03491207857486519
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.030874145136562097,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.030874145136562097
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8939393939393939,
"acc_stderr": 0.021938047738853106,
"acc_norm": 0.8939393939393939,
"acc_norm_stderr": 0.021938047738853106
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9637305699481865,
"acc_stderr": 0.013492659751295153,
"acc_norm": 0.9637305699481865,
"acc_norm_stderr": 0.013492659751295153
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7102564102564103,
"acc_stderr": 0.02300062824368797,
"acc_norm": 0.7102564102564103,
"acc_norm_stderr": 0.02300062824368797
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3814814814814815,
"acc_stderr": 0.029616718927497593,
"acc_norm": 0.3814814814814815,
"acc_norm_stderr": 0.029616718927497593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8151260504201681,
"acc_stderr": 0.025215992877954202,
"acc_norm": 0.8151260504201681,
"acc_norm_stderr": 0.025215992877954202
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4370860927152318,
"acc_stderr": 0.04050035722230636,
"acc_norm": 0.4370860927152318,
"acc_norm_stderr": 0.04050035722230636
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8917431192660551,
"acc_stderr": 0.013321348447611759,
"acc_norm": 0.8917431192660551,
"acc_norm_stderr": 0.013321348447611759
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6157407407407407,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.6157407407407407,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.019907399791316942,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.019907399791316942
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8776371308016878,
"acc_stderr": 0.021331741829746793,
"acc_norm": 0.8776371308016878,
"acc_norm_stderr": 0.021331741829746793
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7847533632286996,
"acc_stderr": 0.027584066602208274,
"acc_norm": 0.7847533632286996,
"acc_norm_stderr": 0.027584066602208274
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.859504132231405,
"acc_stderr": 0.03172233426002158,
"acc_norm": 0.859504132231405,
"acc_norm_stderr": 0.03172233426002158
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.03680918141673881,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.03680918141673881
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.030446777687971726,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.030446777687971726
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5446428571428571,
"acc_stderr": 0.04726835553719097,
"acc_norm": 0.5446428571428571,
"acc_norm_stderr": 0.04726835553719097
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.034926064766237906,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.034926064766237906
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9358974358974359,
"acc_stderr": 0.016046261631673144,
"acc_norm": 0.9358974358974359,
"acc_norm_stderr": 0.016046261631673144
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8914431673052363,
"acc_stderr": 0.011124283175851183,
"acc_norm": 0.8914431673052363,
"acc_norm_stderr": 0.011124283175851183
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7630057803468208,
"acc_stderr": 0.02289408248992599,
"acc_norm": 0.7630057803468208,
"acc_norm_stderr": 0.02289408248992599
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4223463687150838,
"acc_stderr": 0.016519594275297117,
"acc_norm": 0.4223463687150838,
"acc_norm_stderr": 0.016519594275297117
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.022733789405447586,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.022733789405447586
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8135048231511254,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.8135048231511254,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.020736358408060006,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.020736358408060006
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5283687943262412,
"acc_stderr": 0.029779450957303055,
"acc_norm": 0.5283687943262412,
"acc_norm_stderr": 0.029779450957303055
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5495436766623207,
"acc_stderr": 0.012707390438502346,
"acc_norm": 0.5495436766623207,
"acc_norm_stderr": 0.012707390438502346
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7389705882352942,
"acc_stderr": 0.026679252270103128,
"acc_norm": 0.7389705882352942,
"acc_norm_stderr": 0.026679252270103128
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.01681902837573638,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.01681902837573638
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7591836734693878,
"acc_stderr": 0.02737294220178816,
"acc_norm": 0.7591836734693878,
"acc_norm_stderr": 0.02737294220178816
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.026168221344662297,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.026168221344662297
},
"harness|truthfulqa:mc|0": {
"mc1": 0.39412484700122397,
"mc1_stderr": 0.01710658814070033,
"mc2": 0.5565901681593471,
"mc2_stderr": 0.015389712051681206
},
"harness|winogrande|5": {
"acc": 0.829518547750592,
"acc_stderr": 0.010569021122825905
},
"harness|gsm8k|5": {
"acc": 0.6921910538286581,
"acc_stderr": 0.012714401009923645
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mazkobot/0_digit_mask_ensemble_distilled_from_cv12_balanced_mfcc | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 25771854448.0
num_examples: 5061244
download_size: 26296785478
dataset_size: 25771854448.0
---
# Dataset Card for "0_digit_mask_ensemble_distilled_from_cv12_balanced_mfcc"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
atom-in-the-universe/cc-faces-150k | ---
license: apache-2.0
---
# URLs of images containing faces from Common Crawl

## Download images
Select only the first face from each image, keeping faces with a minimum size of 40 pixels:
```python
from datasets import load_dataset

def filter_bbox(bbox, min_size=40):
    # Bounding box is assumed to be stored as (x1, x2, y1, y2)
    x1, x2, y1, y2 = bbox
    return x2 - x1 >= min_size and y2 - y1 >= min_size

ds = load_dataset('atom-in-the-universe/cc-faces-150k')
# Keep only the first face of each image
ds = ds.map(lambda sample: {'faces': sample['faces'][0]})
# Drop faces smaller than the minimum size in either dimension
ds = ds.filter(lambda sample: filter_bbox(sample['faces']))
# load_dataset returns a DatasetDict; export the (assumed) "train" split
ds['train'].to_parquet('cc_faces.parquet')
```
## Download using img2dataset
Install Vanga's fork of img2dataset:
```bash
pip install git+https://github.com/vanga/img2dataset.git
```
Python script:
```python
from img2dataset import download
import os
output_dir = os.path.abspath("bench")
download(
    processes_count=16,
    thread_count=32,
    url_list="cc_faces.parquet",
    image_size=256,
    output_folder=output_dir,
    output_format="files",
    input_format="parquet",
    url_col="url",
    caption_col="alt",
    enable_wandb=True,
    number_sample_per_shard=1000,
    distributor="multiprocessing",
    box_col="faces",
)
``` |
Codec-SUPERB/beehive_states_synth | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 48000
- name: id
dtype: string
splits:
- name: original
num_bytes: 33177655008.0
num_examples: 576
- name: academicodec_hifi_16k_320d
num_bytes: 11059255008.0
num_examples: 576
- name: academicodec_hifi_16k_320d_large_uni
num_bytes: 11059255008.0
num_examples: 576
- name: academicodec_hifi_24k_320d
num_bytes: 16588855008.0
num_examples: 576
- name: dac_16k
num_bytes: 11059255008.0
num_examples: 576
- name: dac_24k
num_bytes: 16588855008.0
num_examples: 576
- name: dac_44k
num_bytes: 30481975008.0
num_examples: 576
- name: encodec_24k
num_bytes: 16588855008.0
num_examples: 576
- name: funcodec_en_libritts_16k_gr1nq32ds320
num_bytes: 11059255008.0
num_examples: 576
- name: funcodec_en_libritts_16k_gr8nq32ds320
num_bytes: 11059255008.0
num_examples: 576
- name: funcodec_en_libritts_16k_nq32ds320
num_bytes: 11059255008.0
num_examples: 576
- name: funcodec_en_libritts_16k_nq32ds640
num_bytes: 11059255008.0
num_examples: 576
- name: funcodec_zh_en_16k_nq32ds320
num_bytes: 11059255008.0
num_examples: 576
- name: funcodec_zh_en_16k_nq32ds640
num_bytes: 11059255008.0
num_examples: 576
- name: speech_tokenizer_16k
num_bytes: 11059255008.0
num_examples: 576
download_size: 217514074682
dataset_size: 224018745120.0
configs:
- config_name: default
data_files:
- split: original
path: data/original-*
- split: academicodec_hifi_16k_320d
path: data/academicodec_hifi_16k_320d-*
- split: academicodec_hifi_16k_320d_large_uni
path: data/academicodec_hifi_16k_320d_large_uni-*
- split: academicodec_hifi_24k_320d
path: data/academicodec_hifi_24k_320d-*
- split: dac_16k
path: data/dac_16k-*
- split: dac_24k
path: data/dac_24k-*
- split: dac_44k
path: data/dac_44k-*
- split: encodec_24k
path: data/encodec_24k-*
- split: funcodec_en_libritts_16k_gr1nq32ds320
path: data/funcodec_en_libritts_16k_gr1nq32ds320-*
- split: funcodec_en_libritts_16k_gr8nq32ds320
path: data/funcodec_en_libritts_16k_gr8nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds320
path: data/funcodec_en_libritts_16k_nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds640
path: data/funcodec_en_libritts_16k_nq32ds640-*
- split: funcodec_zh_en_16k_nq32ds320
path: data/funcodec_zh_en_16k_nq32ds320-*
- split: funcodec_zh_en_16k_nq32ds640
path: data/funcodec_zh_en_16k_nq32ds640-*
- split: speech_tokenizer_16k
path: data/speech_tokenizer_16k-*
---
|
nelson2424/Chess_openings_dataset | ---
license: mit
task_categories:
- text-classification
- text-generation
- text2text-generation
language:
- en
pretty_name: Cot-dataset
---
# Version 1 of the dataset
## Structure of the dataset:
- ### Opening_type:
The title of the opening being played.
- ### Context:
A string representing a list of moves; each move is represented by the previous state of the board, the move that is going to be made, and the effect that the move had on the board.<br>
- The <b>board</b> is represented as an 8×8 grid of characters, where each character represents a piece or an empty square:<br>
~~~
r . . q k b n r
p p p . p . p p
. . n . . p . .
. . . p . b . .
. . . P . . . B
. . . . P N . .
P P P . . P P P
R N . Q K B . R
~~~
- The <b>move</b> is represented in the UCI format (e.g. g8f6), specifying that the piece on square g8 moves to square f6
- The <b>type of move</b> is represented by a list of integers separated by ',', where each integer represents the effect that the move will have on the board:
    - 0 if it is a move without capture
    - 1 if it is a move with capture
    - 2 if a check is being made
    - 3 if it is checkmate
    - 4 if it is an en passant capture
    - 5 if it is kingside castling
    - 6 if it is queenside castling
    - 7 if it is a draw by stalemate
    - 8 if it is a draw by insufficient material
    - 9 if it is a draw by the seventy-five-move rule
    - 10 if it is a draw by fivefold repetition
- The whole context can look something like this:
After each board there is a move, and the effect of the move generates the next board. A context list always ends with a board, because the following two columns represent the move to be played and the effect it will have on the next board.
~~~
r . . q k b n r
p p p . p . p p
. . n . . p . .
. . . p . b . .
. . . P . . . B
. . . . P N . .
P P P . . P P P
R N . Q K B . R
m:e7e5
t:0
r . . q k b n r
p p p . . . p p
. . n . . p . .
. . . p p b . .
. . . P . . . B
. . . . P N . .
P P P . . P P P
R N . Q K B . R
m:d4e5
t:1
r . . q k b n r
p p p . . . p p
. . n . . p . .
. . . p P b . .
. . . . . . . B
. . . . P N . .
P P P . . P P P
R N . Q K B . R
m:f6e5
t:1
r . . q k b n r
p p p . . . p p
. . n . . . . .
. . . p p b . .
. . . . . . . B
. . . . P N . .
P P P . . P P P
R N . Q K B . R
~~~
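The layout above can be split back into structured records. A minimal sketch (the function and its return shape are illustrative assumptions, not part of the dataset) that parses such a context string into (board, move, move-type) triples:

```python
def parse_context(context: str):
    """Split a context string into (board, move, move_types) records.

    Each record is 8 board lines followed by an 'm:' move line and a
    't:' move-type line, as in the example above; a context always ends
    with a trailing board that has no move attached.
    """
    records = []
    lines = [ln for ln in context.strip().splitlines() if ln.strip()]
    i = 0
    while i + 9 < len(lines):  # 8 board lines + 'm:' line + 't:' line
        board = "\n".join(lines[i:i + 8])
        move = lines[i + 8].removeprefix("m:")
        types = [int(c) for c in lines[i + 9].removeprefix("t:").split(",")]
        records.append((board, move, types))
        i += 10
    return records
```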
- ### Move_type_pred:
  - Follows the same format as the <b>type of move</b> described in the context column.
- ### Move_pred:
  - Follows the same format as the <b>move</b> described in the context column.
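As a sketch (the mapping and helper names are illustrative, not part of the dataset), the integer move-type codes can be decoded into human-readable effects:

```python
# Hypothetical helper: decode the integer move-type codes listed in this card.
MOVE_TYPE = {
    0: "move without capture",
    1: "move with capture",
    2: "check",
    3: "checkmate",
    4: "en passant capture",
    5: "kingside castling",
    6: "queenside castling",
    7: "draw by stalemate",
    8: "draw by insufficient material",
    9: "draw by seventy-five-move rule",
    10: "draw by fivefold repetition",
}

def decode_move_types(field: str) -> list:
    """Decode a 't:' field such as '1,2' into a list of effect names."""
    return [MOVE_TYPE[int(code)] for code in field.split(",")]

print(decode_move_types("1,2"))  # a capture that also gives check
```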
## Creation process:
- ### Loading the Dataset:
  - The code loads a dataset of chess games in PGN format using the Hugging Face **datasets** library. The dataset is called [patrickfrank1/chess-pgn-games](https://huggingface.co/datasets/patrickfrank1/chess-pgn-games).
- ### Parsing and Organizing Game Text:
- It extracts game text from the dataset and organizes it based on metadata and moves information.
- ### Parsing Game Information:
- It extracts relevant information from the game headers, such as player Elo ratings and opening names.
- ### Iterating Through Games:
- It iterates through each game and processes it if it has a specified opening and if at least one player has an Elo rating greater than 1700.
- ### Sampling Moves for Context:
- For selected games, it randomly samples subarrays of moves from the mainline of the game.
- ### Recording Context Information:
- It records the board state, move information, and move type prediction for each move in the sampled context.
- ### Storing Processed Data:
  - The extracted information is stored in a dictionary and then converted to a data frame. The data frame is uploaded to the Hugging Face Dataset Hub. (As you can see)
- The code to create this dataset can be found here: [chess_openings_teacher/ML/Dataset_Creation](https://github.com/bit2424/chess_openings_teacher/tree/main/ML/Dataset_Creation)
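The game-selection step described above can be sketched roughly as follows (header names follow the PGN standard; this is an assumption-based illustration, and the actual script lives in the repository linked above):

```python
def keep_game(headers: dict) -> bool:
    """Keep games with a named opening and at least one player above 1700 Elo."""
    def elo(key: str) -> int:
        try:
            return int(headers.get(key, "0"))
        except ValueError:  # PGN uses "?" for unknown ratings
            return 0

    has_opening = headers.get("Opening", "?") not in ("", "?")
    return has_opening and max(elo("WhiteElo"), elo("BlackElo")) > 1700

print(keep_game({"Opening": "Sicilian Defense",
                 "WhiteElo": "1850", "BlackElo": "1600"}))  # True
```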
## Intuitions behind the design:
- The idea is that by creating the whole board grid, the model can learn to grasp the effect that a move has on the board and create a richer representation of the game.
- One of the aims of this representation is to help predict logical moves even without needing the game's history, just using the current state of the board in the grid representation. |
Brendan/PMUL4976_only_dataset | ---
dataset_info:
features:
- name: dialogue_id
dtype: string
- name: turn_id
dtype: int8
- name: domains
sequence: string
- name: system_utterances
sequence: string
- name: user_utterances
sequence: string
- name: slot_values
struct:
- name: hotel
struct:
- name: price range
dtype: string
- name: type
dtype: string
- name: parking
dtype: string
- name: book day
dtype: string
- name: book people
dtype: string
- name: book stay
dtype: string
- name: stars
dtype: string
- name: internet
dtype: string
- name: name
dtype: string
- name: area
dtype: string
- name: train
struct:
- name: arrive by
dtype: string
- name: departure
dtype: string
- name: day
dtype: string
- name: book people
dtype: string
- name: leave at
dtype: string
- name: destination
dtype: string
- name: attraction
struct:
- name: area
dtype: string
- name: name
dtype: string
- name: type
dtype: string
- name: restaurant
struct:
- name: price range
dtype: string
- name: area
dtype: string
- name: food
dtype: string
- name: name
dtype: string
- name: book day
dtype: string
- name: book people
dtype: string
- name: book time
dtype: string
- name: hospital
struct:
- name: department
dtype: string
- name: taxi
struct:
- name: leave at
dtype: string
- name: destination
dtype: string
- name: departure
dtype: string
- name: arrive by
dtype: string
- name: bus
struct:
- name: departure
dtype: string
- name: destination
dtype: string
- name: leave at
dtype: string
- name: day
dtype: string
- name: police
struct:
- name: name
dtype: string
- name: turn_slot_values
struct:
- name: hotel
struct:
- name: price range
dtype: string
- name: type
dtype: string
- name: parking
dtype: string
- name: book day
dtype: string
- name: book people
dtype: string
- name: book stay
dtype: string
- name: stars
dtype: string
- name: internet
dtype: string
- name: name
dtype: string
- name: area
dtype: string
- name: train
struct:
- name: arrive by
dtype: string
- name: departure
dtype: string
- name: day
dtype: string
- name: book people
dtype: string
- name: leave at
dtype: string
- name: destination
dtype: string
- name: attraction
struct:
- name: area
dtype: string
- name: name
dtype: string
- name: type
dtype: string
- name: restaurant
struct:
- name: price range
dtype: string
- name: area
dtype: string
- name: food
dtype: string
- name: name
dtype: string
- name: book day
dtype: string
- name: book people
dtype: string
- name: book time
dtype: string
- name: hospital
struct:
- name: department
dtype: string
- name: taxi
struct:
- name: leave at
dtype: string
- name: destination
dtype: string
- name: departure
dtype: string
- name: arrive by
dtype: string
- name: bus
struct:
- name: departure
dtype: string
- name: destination
dtype: string
- name: leave at
dtype: string
- name: day
dtype: string
- name: police
struct:
- name: name
dtype: string
- name: last_slot_values
struct:
- name: hotel
struct:
- name: price range
dtype: string
- name: type
dtype: string
- name: parking
dtype: string
- name: book day
dtype: string
- name: book people
dtype: string
- name: book stay
dtype: string
- name: stars
dtype: string
- name: internet
dtype: string
- name: name
dtype: string
- name: area
dtype: string
- name: train
struct:
- name: arrive by
dtype: string
- name: departure
dtype: string
- name: day
dtype: string
- name: book people
dtype: string
- name: leave at
dtype: string
- name: destination
dtype: string
- name: attraction
struct:
- name: area
dtype: string
- name: name
dtype: string
- name: type
dtype: string
- name: restaurant
struct:
- name: price range
dtype: string
- name: area
dtype: string
- name: food
dtype: string
- name: name
dtype: string
- name: book day
dtype: string
- name: book people
dtype: string
- name: book time
dtype: string
- name: hospital
struct:
- name: department
dtype: string
- name: taxi
struct:
- name: leave at
dtype: string
- name: destination
dtype: string
- name: departure
dtype: string
- name: arrive by
dtype: string
- name: bus
struct:
- name: departure
dtype: string
- name: destination
dtype: string
- name: leave at
dtype: string
- name: day
dtype: string
- name: police
struct:
- name: name
dtype: string
- name: system_response_acts
sequence: string
- name: system_response
dtype: string
splits:
- name: valid_evaluable_only
num_bytes: 15490.408443056875
num_examples: 11
download_size: 59035
dataset_size: 15490.408443056875
---
# Dataset Card for "PMUL4976_only_dataset"
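Each row follows the nested `slot_values` schema above, with one struct per domain. The snippet below shows the shape of a single turn and how the nested slots can be flattened into the `domain-slot` pairs commonly used in dialogue state tracking evaluation. The values here are made up for illustration and only mirror the schema; the real rows come from the MultiWOZ dialogue this dataset is named after.

```python
# Illustrative row mirroring the schema above; all values are invented.
example_turn = {
    "dialogue_id": "PMUL4976.json",
    "turn_id": 0,
    "domains": ["train", "attraction"],
    "slot_values": {
        "train": {"departure": "cambridge", "destination": "london kings cross"},
        "attraction": {"area": "centre"},
    },
}

# Flatten the nested per-domain structs into "domain-slot" pairs.
flat = {
    f"{domain}-{slot}": value
    for domain, slots in example_turn["slot_values"].items()
    for slot, value in slots.items()
}
print(flat)
```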
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
qgallouedec/prj_gia_dataset_metaworld_assembly_v2_1112 | ---
dataset_info:
features:
- name: observations
sequence: float32
- name: actions
sequence: float32
- name: dones
dtype: bool
- name: rewards
dtype: float32
splits:
- name: train
num_bytes: 18412500
num_examples: 100000
download_size: 7293153
dataset_size: 18412500
---
# Dataset Card for "prj_gia_dataset_metaworld_assembly_v2_1112"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ucla-cmllab/WizardLM_evol_instruct_V2_196k-chat-format | ---
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: idx
dtype: string
splits:
- name: train
num_bytes: 337488957
num_examples: 143000
download_size: 0
dataset_size: 337488957
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "WizardLM_evol_instruct_V2_196k-chat-format"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlekseyKorshuk/hh-lmgym-demo | ---
dataset_info:
features:
- name: input_text
dtype: string
- name: output_text
dtype: string
splits:
- name: train
num_bytes: 126803175
num_examples: 112052
- name: test
num_bytes: 14079595
num_examples: 12451
download_size: 0
dataset_size: 140882770
---
# Dataset Card for "hh-lmgym-demo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
khoomeik/gzipscale-code-html-256M | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 1028001028
num_examples: 1000001
download_size: 263145842
dataset_size: 1028001028
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Cheetor1996/mountain_tribe_girls | ---
license: cc-by-2.0
language:
- en
tags:
- art
---
*"In the depths of a lush forest, nestled near majestic mountains, thrives an extraordinary female-only tribe. This enchanting community boasts a unique blend of captivating features, as they are distinguished by their striking blonde hair, mesmerizing orange eyes, and a complexion kissed by the sun, adorned with rich dark skin.
Living harmoniously with nature, this tribe has found solace in the vibrant forest that surrounds them. Within their domain, they are blessed with wondrous gifts of the land. Among the wonders that grace their home are natural hot springs, where warm waters rejuvenate their spirits and provide a sanctuary for reflection and relaxation. The forest is also adorned with cherry blossom trees, which burst into bloom each spring, transforming the tribe's surroundings into a surreal canvas of delicate petals in hues of pink and white.
Beyond their physical allure, the women of this tribe possess an inherent charm that captivates all who encounter them. Their allure and sensuality emanate from their deep connection with their surroundings, as they navigate the forest with grace and embrace the natural rhythms of life. It is through this symbiotic relationship with nature that they have honed their mystique, exuding a magnetic presence that leaves a lasting impression on those fortunate enough to witness their radiance.
In this hidden corner of the world, the female-only tribe reigns as guardians of the forest, cherishing its beauty and protecting its secrets. As they wander through the verdant landscape, their presence is an embodiment of the untamed spirit of the wilderness, seamlessly merging their ethereal beauty with the captivating nature that surrounds them."*
Trained on the Anime (full-final-pruned) model, using images generated with Waifulabs.com
Activation tags: **mountain tribe** (for general info), and stock character names (the ones found in the image here) to get a specific design. You may also make your own OCs with this.
Recommended LoRA weight blocks: OUTD and OUTALL (you can still use ALL and MIDD, but results can be messy; use at your own risk.)
Recommended weights: **0.7 - 1.0** |
Akshita15/blaupunkt_data | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': Product Queries
'1': Product shipping
'2': bank emi
'3': cancel order
'4': complain
'5': courier products
'6': discount code
'7': exchange offer
'8': invoice
'9': payment
'10': promo coupon
'11': redeem voucher
'12': replace
'13': return
'14': service center
'15': tickets
'16': warranty
splits:
- name: train
num_bytes: 80209
num_examples: 877
download_size: 0
dataset_size: 80209
---
# Dataset Card for "blaupunkt_data"
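The `label` column is a `class_label` feature, so rows store integer ids that map to the class names listed in the metadata above, in order. A minimal sketch of that mapping (the `int2str` helper here is a plain stand-in for the equivalent `datasets.ClassLabel.int2str` method):

```python
# Class names in the exact order declared in the dataset metadata (ids 0-16).
label_names = [
    "Product Queries", "Product shipping", "bank emi", "cancel order",
    "complain", "courier products", "discount code", "exchange offer",
    "invoice", "payment", "promo coupon", "redeem voucher", "replace",
    "return", "service center", "tickets", "warranty",
]

def int2str(label_id: int) -> str:
    """Map an integer label id back to its class name."""
    return label_names[label_id]

print(int2str(3))  # cancel order
```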
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_248 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 918744468.0
num_examples: 180429
download_size: 935680843
dataset_size: 918744468.0
---
# Dataset Card for "chunk_248"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Liiizt9/230512pics | ---
license: openrail
---
|
open-llm-leaderboard/details_ziniuli__Mistral-7B-ReMax-v0.1 | ---
pretty_name: Evaluation run of ziniuli/Mistral-7B-ReMax-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ziniuli/Mistral-7B-ReMax-v0.1](https://huggingface.co/ziniuli/Mistral-7B-ReMax-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ziniuli__Mistral-7B-ReMax-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-11T18:07:55.437908](https://huggingface.co/datasets/open-llm-leaderboard/details_ziniuli__Mistral-7B-ReMax-v0.1/blob/main/results_2024-03-11T18-07-55.437908.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6076779028291911,\n\
\ \"acc_stderr\": 0.03315115503196382,\n \"acc_norm\": 0.61214149482687,\n\
\ \"acc_norm_stderr\": 0.03382378734979988,\n \"mc1\": 0.5287637698898409,\n\
\ \"mc1_stderr\": 0.017474513848525518,\n \"mc2\": 0.6815516476213737,\n\
\ \"mc2_stderr\": 0.015177768821414346\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5964163822525598,\n \"acc_stderr\": 0.014337158914268445,\n\
\ \"acc_norm\": 0.6331058020477816,\n \"acc_norm_stderr\": 0.014084133118104301\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6692889862577176,\n\
\ \"acc_stderr\": 0.004695076629884538,\n \"acc_norm\": 0.8498307110137423,\n\
\ \"acc_norm_stderr\": 0.0035650718701954473\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n\
\ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n\
\ \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.032600385118357715,\n\
\ \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.032600385118357715\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n\
\ \"acc_stderr\": 0.046306532033665956,\n \"acc_norm\": 0.41228070175438597,\n\
\ \"acc_norm_stderr\": 0.046306532033665956\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n\
\ \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.38095238095238093,\n \"acc_stderr\": 0.025010749116137602,\n \"\
acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.025010749116137602\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6516129032258065,\n\
\ \"acc_stderr\": 0.02710482632810094,\n \"acc_norm\": 0.6516129032258065,\n\
\ \"acc_norm_stderr\": 0.02710482632810094\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.0351760354036101,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.0351760354036101\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198896,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198896\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.02578772318072387,\n\
\ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.02578772318072387\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.558974358974359,\n \"acc_stderr\": 0.025174048384000745,\n \
\ \"acc_norm\": 0.558974358974359,\n \"acc_norm_stderr\": 0.025174048384000745\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.02822644674968351,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.02822644674968351\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059278,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059278\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.038969819642573754,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.038969819642573754\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7963302752293578,\n \"acc_stderr\": 0.017266742087630797,\n \"\
acc_norm\": 0.7963302752293578,\n \"acc_norm_stderr\": 0.017266742087630797\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044812,\n \"\
acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044812\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n\
\ \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.6188340807174888,\n\
\ \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326466,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326466\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597552,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597552\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.014866821664709583,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.014866821664709583\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.024752411960917205,\n\
\ \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.024752411960917205\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3206703910614525,\n\
\ \"acc_stderr\": 0.015609929559348408,\n \"acc_norm\": 0.3206703910614525,\n\
\ \"acc_norm_stderr\": 0.015609929559348408\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.026336613469046626,\n\
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.026336613469046626\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.691358024691358,\n \"acc_stderr\": 0.025702640260603746,\n\
\ \"acc_norm\": 0.691358024691358,\n \"acc_norm_stderr\": 0.025702640260603746\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.02973659252642444,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.02973659252642444\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43415906127770537,\n\
\ \"acc_stderr\": 0.012659033237067248,\n \"acc_norm\": 0.43415906127770537,\n\
\ \"acc_norm_stderr\": 0.012659033237067248\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.02952009569768776,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.02952009569768776\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6421568627450981,\n \"acc_stderr\": 0.019393058402355435,\n \
\ \"acc_norm\": 0.6421568627450981,\n \"acc_norm_stderr\": 0.019393058402355435\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304328,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304328\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7562189054726368,\n\
\ \"acc_stderr\": 0.03036049015401464,\n \"acc_norm\": 0.7562189054726368,\n\
\ \"acc_norm_stderr\": 0.03036049015401464\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"\
acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5287637698898409,\n\
\ \"mc1_stderr\": 0.017474513848525518,\n \"mc2\": 0.6815516476213737,\n\
\ \"mc2_stderr\": 0.015177768821414346\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7734806629834254,\n \"acc_stderr\": 0.011764149054698338\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3957543593631539,\n \
\ \"acc_stderr\": 0.01346982370104881\n }\n}\n```"
repo_url: https://huggingface.co/ziniuli/Mistral-7B-ReMax-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|arc:challenge|25_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|arc:challenge|25_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|gsm8k|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|gsm8k|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hellaswag|10_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hellaswag|10_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|winogrande|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|winogrande|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-11T18-07-55.437908.parquet'
- config_name: results
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- results_2024-03-09T23-05-10.060154.parquet
- split: 2024_03_11T18_07_55.437908
path:
- results_2024-03-11T18-07-55.437908.parquet
- split: latest
path:
- results_2024-03-11T18-07-55.437908.parquet
---
# Dataset Card for Evaluation run of ziniuli/Mistral-7B-ReMax-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ziniuli/Mistral-7B-ReMax-v0.1](https://huggingface.co/ziniuli/Mistral-7B-ReMax-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ziniuli__Mistral-7B-ReMax-v0.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-11T18:07:55.437908](https://huggingface.co/datasets/open-llm-leaderboard/details_ziniuli__Mistral-7B-ReMax-v0.1/blob/main/results_2024-03-11T18-07-55.437908.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6076779028291911,
"acc_stderr": 0.03315115503196382,
"acc_norm": 0.61214149482687,
"acc_norm_stderr": 0.03382378734979988,
"mc1": 0.5287637698898409,
"mc1_stderr": 0.017474513848525518,
"mc2": 0.6815516476213737,
"mc2_stderr": 0.015177768821414346
},
"harness|arc:challenge|25": {
"acc": 0.5964163822525598,
"acc_stderr": 0.014337158914268445,
"acc_norm": 0.6331058020477816,
"acc_norm_stderr": 0.014084133118104301
},
"harness|hellaswag|10": {
"acc": 0.6692889862577176,
"acc_stderr": 0.004695076629884538,
"acc_norm": 0.8498307110137423,
"acc_norm_stderr": 0.0035650718701954473
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.042763494943765995,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.042763494943765995
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.046306532033665956,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.046306532033665956
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.025010749116137602,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.025010749116137602
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6516129032258065,
"acc_stderr": 0.02710482632810094,
"acc_norm": 0.6516129032258065,
"acc_norm_stderr": 0.02710482632810094
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.0351760354036101,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.0351760354036101
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198896,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198896
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.02578772318072387,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.02578772318072387
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.558974358974359,
"acc_stderr": 0.025174048384000745,
"acc_norm": 0.558974358974359,
"acc_norm_stderr": 0.025174048384000745
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.02822644674968351,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.02822644674968351
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059278,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059278
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.038969819642573754,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.038969819642573754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7963302752293578,
"acc_stderr": 0.017266742087630797,
"acc_norm": 0.7963302752293578,
"acc_norm_stderr": 0.017266742087630797
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.03385177976044812,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.03385177976044812
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326466,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326466
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597552,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597552
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.014866821664709583,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.014866821664709583
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.024752411960917205,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.024752411960917205
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3206703910614525,
"acc_stderr": 0.015609929559348408,
"acc_norm": 0.3206703910614525,
"acc_norm_stderr": 0.015609929559348408
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.026336613469046626,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.026336613469046626
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.691358024691358,
"acc_stderr": 0.025702640260603746,
"acc_norm": 0.691358024691358,
"acc_norm_stderr": 0.025702640260603746
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.02973659252642444,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.02973659252642444
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43415906127770537,
"acc_stderr": 0.012659033237067248,
"acc_norm": 0.43415906127770537,
"acc_norm_stderr": 0.012659033237067248
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.02952009569768776,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.02952009569768776
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6421568627450981,
"acc_stderr": 0.019393058402355435,
"acc_norm": 0.6421568627450981,
"acc_norm_stderr": 0.019393058402355435
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304328,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304328
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7562189054726368,
"acc_stderr": 0.03036049015401464,
"acc_norm": 0.7562189054726368,
"acc_norm_stderr": 0.03036049015401464
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5287637698898409,
"mc1_stderr": 0.017474513848525518,
"mc2": 0.6815516476213737,
"mc2_stderr": 0.015177768821414346
},
"harness|winogrande|5": {
"acc": 0.7734806629834254,
"acc_stderr": 0.011764149054698338
},
"harness|gsm8k|5": {
"acc": 0.3957543593631539,
"acc_stderr": 0.01346982370104881
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
DiegoRoberto10/diegoroberto | ---
license: openrail
---
|
arthurmluz/cstnews_data-xlsum_temario_results | ---
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
- name: summary
dtype: string
- name: gen_summary
dtype: string
- name: rouge
struct:
- name: rouge1
dtype: float64
- name: rouge2
dtype: float64
- name: rougeL
dtype: float64
- name: rougeLsum
dtype: float64
- name: bert
struct:
- name: f1
sequence: float64
- name: hashcode
dtype: string
- name: precision
sequence: float64
- name: recall
sequence: float64
- name: moverScore
dtype: float64
splits:
- name: validation
num_bytes: 56230
num_examples: 16
download_size: 53610
dataset_size: 56230
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "cstnews_data-xlsum_temario_results"
rouge= {'rouge1': 0.47212177671448063, 'rouge2': 0.2811985678373053, 'rougeL': 0.348694400169423, 'rougeLsum': 0.348694400169423}
bert= {'precision': 0.7867038622498512, 'recall': 0.7567419111728668, 'f1': 0.7705440074205399}
mover = 0.6284352769565635 |
AdapterOcean/python3-standardized_cluster_21 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 7573658
num_examples: 667
download_size: 1501886
dataset_size: 7573658
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "python3-standardized_cluster_21"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MananSantoki/vadodara-jsonl-converted | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 103428
num_examples: 410
download_size: 40823
dataset_size: 103428
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "vadodara-jsonl-converted"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NobodyExistsOnTheInternet/turbotoconvert | ---
license: mit
---
|
AliAsh/digikala_translated_small_5m | ---
language:
- fa
pretty_name: digikala-5m
size_categories:
- 1B<n<10B
---
# Digikala Dataset Small 5m
- Digikala product titles translated with the standard Google Translate API
- Category and brand English translations may be invalid, but `title_en` has been checked
|
ljvmiranda921/tlunified-ner | ---
license: gpl-3.0
task_categories:
- token-classification
task_ids:
- named-entity-recognition
language:
- tl
size_categories:
- 1K<n<10K
pretty_name: TLUnified-NER
tags:
- low-resource
- named-entity-recognition
annotations_creators:
- expert-generated
multilinguality:
- monolingual
train-eval-index:
- config: conllpp
task: token-classification
task_id: entity_extraction
splits:
train_split: train
eval_split: test
col_mapping:
tokens: tokens
ner_tags: tags
metrics:
- type: seqeval
name: seqeval
---
<!-- SPACY PROJECT: AUTO-GENERATED DOCS START (do not remove) -->
# 🪐 spaCy Project: TLUnified-NER Corpus
- **Homepage:** [Github](https://github.com/ljvmiranda921/calamanCy)
- **Repository:** [Github](https://github.com/ljvmiranda921/calamanCy)
- **Point of Contact:** ljvmiranda@gmail.com
### Dataset Summary
This dataset contains the annotated TLUnified corpora from Cruz and Cheng
(2021). It is a curated sample of around 7,000 documents for the
named entity recognition (NER) task. The majority of the corpus consists of news
reports in Tagalog, resembling the domain of the original CoNLL 2003. There
are three entity types: Person (PER), Organization (ORG), and Location (LOC).
| Dataset | Examples | PER | ORG | LOC |
|-------------|----------|------|------|------|
| Train | 6252 | 6418 | 3121 | 3296 |
| Development | 782 | 793 | 392 | 409 |
| Test | 782 | 818 | 423 | 438 |
### Data Fields
The data fields are the same among all splits:
- `id`: a `string` feature
- `tokens`: a `list` of `string` features.
- `ner_tags`: a `list` of classification labels, with possible values including `O` (0), `B-PER` (1), `I-PER` (2), `B-ORG` (3), `I-ORG` (4), `B-LOC` (5), `I-LOC` (6)
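As a minimal sketch, the integer `ner_tags` can be mapped back to their string labels using the index order listed above:

```python
# Index-to-label mapping, in the order given in the "Data Fields" section.
NER_LABELS = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC"]

def decode_tags(ner_tags):
    """Convert a list of integer tag ids into their IOB string labels."""
    return [NER_LABELS[t] for t in ner_tags]

print(decode_tags([0, 1, 2, 0, 5]))  # ['O', 'B-PER', 'I-PER', 'O', 'B-LOC']
```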
### Annotation process
The author, together with two more annotators, labeled curated portions of
TLUnified over the course of four months. All annotators are native speakers of
Tagalog. For each annotation round, the annotators resolved disagreements,
updated the annotation guidelines, and corrected past annotations. They
followed the process prescribed by [Reiter
(2017)](https://nilsreiter.de/blog/2017/howto-annotation).
They also measured the inter-annotator agreement (IAA) by computing pairwise
comparisons and averaging the results:
- Cohen's Kappa (all tokens): 0.81
- Cohen's Kappa (annotated tokens only): 0.65
- F1-score: 0.91
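The pairwise Cohen's kappa reported above can be computed from two annotators' token-label sequences; the following is a generic sketch of the metric, not the authors' exact evaluation code:

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa between two annotators' token-label sequences."""
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    # Observed agreement: fraction of tokens both annotators labeled identically.
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Expected chance agreement, from each annotator's marginal label distribution.
    counts_a, counts_b = Counter(labels_a), Counter(labels_b)
    expected = sum(counts_a[label] * counts_b[label] for label in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

print(cohens_kappa(["O", "O", "B-PER", "O"], ["O", "B-PER", "B-PER", "O"]))  # 0.5
```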
### About this repository
This repository is a [spaCy project](https://spacy.io/usage/projects) for
converting the annotated spaCy files into IOB. The process goes like this: we
download the raw corpus from Google Cloud Storage (GCS), convert the spaCy
files into a readable IOB format, and parse that using our loading script
(i.e., `tlunified-ner.py`). We're also shipping the IOB file so that it's
easier to access.
## 📋 project.yml
The [`project.yml`](project.yml) defines the data assets required by the
project, as well as the available commands and workflows. For details, see the
[spaCy projects documentation](https://spacy.io/usage/projects).
### ⏯ Commands
The following commands are defined by the project. They
can be executed using [`spacy project run [name]`](https://spacy.io/api/cli#project-run).
Commands are only re-run if their inputs have changed.
| Command | Description |
| --- | --- |
| `setup-data` | Prepare the Tagalog corpora used for training various spaCy components |
| `upload-to-hf` | Upload dataset to HuggingFace Hub |
### ⏭ Workflows
The following workflows are defined by the project. They
can be executed using [`spacy project run [name]`](https://spacy.io/api/cli#project-run)
and will run the specified commands in order. Commands are only re-run if their
inputs have changed.
| Workflow | Steps |
| --- | --- |
| `all` | `setup-data` → `upload-to-hf` |
### 🗂 Assets
The following assets are defined by the project. They can
be fetched by running [`spacy project assets`](https://spacy.io/api/cli#project-assets)
in the project directory.
| File | Source | Description |
| --- | --- | --- |
| `assets/corpus.tar.gz` | URL | Annotated TLUnified corpora in spaCy format with train, dev, and test splits. |
<!-- SPACY PROJECT: AUTO-GENERATED DOCS END (do not remove) -->
### Citation
You can cite this dataset as:
```
@misc{miranda2023developing,
title={Developing a Named Entity Recognition Dataset for Tagalog},
author={Lester James V. Miranda},
year={2023},
eprint={2311.07161},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
AlexWortega/EVILdolly | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: q
dtype: string
- name: a
dtype: string
splits:
- name: train
num_bytes: 9668252
num_examples: 15012
download_size: 6313247
dataset_size: 9668252
license: cc-by-sa-3.0
task_categories:
- question-answering
- summarization
language:
- en
size_categories:
- 10K<n<100K
---
# Summary
`EVILDolly` is an open source dataset of instruction-following records with wrong answers derived from [databricks-dolly-15k](https://huggingface.co/datasets/databricks/databricks-dolly-15k).
The dataset includes answers that are wrong but appear correct and reasonable. The goal is to provide negative samples for aligning language models.
This dataset can be used for any purpose, whether academic or commercial, under the terms of the
[Creative Commons Attribution-ShareAlike 3.0 Unported License](https://creativecommons.org/licenses/by-sa/3.0/legalcode).
|
autoevaluate/autoeval-staging-eval-project-2e072638-8015093 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- catalonia_independence
eval_info:
task: multi_class_classification
model: JonatanGk/roberta-base-ca-finetuned-catalonia-independence-detector
metrics: []
dataset_name: catalonia_independence
dataset_config: catalan
dataset_split: test
col_mapping:
text: TWEET
target: LABEL
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: JonatanGk/roberta-base-ca-finetuned-catalonia-independence-detector
* Dataset: catalonia_independence
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
autoevaluate/autoeval-staging-eval-project-samsum-0c672345-10275366 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- samsum
eval_info:
task: summarization
model: knkarthick/bart-large-xsum-samsum
metrics: []
dataset_name: samsum
dataset_config: samsum
dataset_split: train
col_mapping:
text: dialogue
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: knkarthick/bart-large-xsum-samsum
* Dataset: samsum
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@ikadebi](https://huggingface.co/ikadebi) for evaluating this model. |
zolak/twitter_dataset_78_1713073648 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 3256989
num_examples: 8041
download_size: 1632745
dataset_size: 3256989
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
EdwardLin2023/AESDD | ---
license: cc-by-4.0
---
# Acted Emotional Speech Dynamic Database v1.0
## ABOUT
AESDD v1.0 was created in October 2017 in the Laboratory of Electronic Media, School of
Journalism and Mass Communications, Aristotle University of Thessaloniki, for
the needs of Speech Emotion Recognition research of the Multidisciplinary Media &
Mediated Communication Research Group (M3C, http://m3c.web.auth.gr/).
It is a collection of utterances of emotional speech acted by professional actors.
This version is the initial state of AESDD. The purpose of this project is the continuous
growth of the database through the collaborative effort of the M3C research group and
theatrical teams.
## CREATION OF THE DATABASE
For the creation of v.1 of the database, 5 (3 female and 2 male) professional actors were
recorded. 19 utterances of ambiguous, out-of-context emotional content were chosen. The
actors acted these 19 utterances in every one of the 5 chosen emotions. One extra improvised
utterance was added for every actor and emotion. The guidance of the actors and the choice
of the final recordings were supervised by a scientific expert in dramatology.
For some of the utterances, more than one take was qualified.
Consequently, around 500 utterances ended up in the final database.
UPDATE: Since the AESDD is dynamic by definition, more actors have been recorded and added,
following the same naming scheme described in the section "ORGANISING THE DATABASE".
## CHOSEN EMOTIONS
Five emotions were chosen:
1. a (anger)
2. d (disgust)
3. f (fear)
4. h (happiness)
5. s (sadness)
## ORGANISING THE DATABASE
There are five folders, named after the five emotion classes.
Every file name in the database is in the following form: xAA (B)
where
- x is the first letter of the emotion (a --> anger, h --> happiness, etc.)
- AA is the number of the utterance (01, 02, ..., 20)
- B is the number of the speaker (1 --> 1st speaker, 2 --> 2nd speaker, etc.)
e.g. 'a03 (4).wav' is the 3rd utterance spoken by the 4th speaker with anger
In the case where two takes were qualified for the same utterance, they are distinguished
with a lower case letter.
e.g. 'f18 (5).wav' and 'f18 (5)b.wav' are two different versions of the 5th actor saying the
18th utterance with fear.
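The naming scheme above can be parsed mechanically. The following minimal sketch (plain Python; the default take label `'a'` for single-take files is an assumption, not part of the official scheme) recovers the emotion, utterance number, speaker number, and take letter from a file name:

```python
import re

# Map the single-letter prefixes to the five AESDD emotion classes.
EMOTIONS = {'a': 'anger', 'd': 'disgust', 'f': 'fear', 'h': 'happiness', 's': 'sadness'}

# Matches names such as 'a03 (4).wav' or 'f18 (5)b.wav'.
FILENAME_RE = re.compile(r'^([adfhs])(\d{2}) \((\d+)\)([a-z]?)\.wav$')

def parse_aesdd_filename(name):
    """Return (emotion, utterance_no, speaker_no, take) for an AESDD file name."""
    m = FILENAME_RE.match(name)
    if m is None:
        raise ValueError(f'unrecognised AESDD file name: {name!r}')
    letter, utterance, speaker, take = m.groups()
    # Treat a missing take suffix as the (assumed) first take 'a'.
    return EMOTIONS[letter], int(utterance), int(speaker), take or 'a'

print(parse_aesdd_filename('a03 (4).wav'))   # ('anger', 3, 4, 'a')
print(parse_aesdd_filename('f18 (5)b.wav'))  # ('fear', 18, 5, 'b')
```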
## References
1. Vryzas, N., Kotsakis, R., Liatsou, A., Dimoulas, C. A., & Kalliris, G. (2018). Speech emotion recognition for performance interaction. Journal of the Audio Engineering Society, 66(6), 457-467.
2. Vryzas, N., Matsiola, M., Kotsakis, R., Dimoulas, C., & Kalliris, G. (2018, September). Subjective Evaluation of a Speech Emotion Recognition Interaction Framework. In Proceedings of the Audio Mostly 2018 on Sound in Immersion and Emotion (p. 34). ACM.
|
CyberHarem/larchel_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of larchel (Fire Emblem)
This is the dataset of larchel (Fire Emblem), containing 75 images and their tags.
The core tags of this character are `green_hair, green_eyes, breasts, long_hair, large_breasts, bangs, hair_ornament, hair_flower`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 75 | 80.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/larchel_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 75 | 52.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/larchel_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 156 | 101.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/larchel_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 75 | 73.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/larchel_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 156 | 135.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/larchel_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/larchel_fireemblem',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discoverable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, open_mouth, white_gloves, armor, dress, elbow_gloves, ponytail, :d, cape, circlet |
| 1 | 6 |  |  |  |  |  | 1girl, bare_shoulders, blush, flower, solo, thighs, circlet, parted_bangs, parted_lips, highleg, looking_at_viewer, smile, swimsuit, ass, collarbone |
| 2 | 7 |  |  |  |  |  | 1girl, blush, circlet, collarbone, crop_top, looking_at_viewer, navel, parted_bangs, thighs, flower, smile, solo, white_shirt, bare_shoulders, cleavage, long_sleeves, tassel, closed_mouth, off-shoulder_shirt, white_panties, high-waist_pants, midriff, simple_background, tight_pants |
| 3 | 6 |  |  |  |  |  | 1girl, solo, coke-bottle_glasses, earrings, eyewear_on_head, gloves, looking_at_viewer, halloween_costume, alternate_costume, holding_lollipop, labcoat, ponytail, round-bottom_flask, smile, thighhighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | open_mouth | white_gloves | armor | dress | elbow_gloves | ponytail | :d | cape | circlet | bare_shoulders | blush | flower | solo | thighs | parted_bangs | parted_lips | highleg | looking_at_viewer | smile | swimsuit | ass | collarbone | crop_top | navel | white_shirt | cleavage | long_sleeves | tassel | closed_mouth | off-shoulder_shirt | white_panties | high-waist_pants | midriff | simple_background | tight_pants | coke-bottle_glasses | earrings | eyewear_on_head | gloves | halloween_costume | alternate_costume | holding_lollipop | labcoat | round-bottom_flask | thighhighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:---------------|:--------|:--------|:---------------|:-----------|:-----|:-------|:----------|:-----------------|:--------|:---------|:-------|:---------|:---------------|:--------------|:----------|:--------------------|:--------|:-----------|:------|:-------------|:-----------|:--------|:--------------|:-----------|:---------------|:---------|:---------------|:---------------------|:----------------|:-------------------|:----------|:--------------------|:--------------|:----------------------|:-----------|:------------------|:---------|:--------------------|:--------------------|:-------------------|:----------|:---------------------|:-------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | | | | | | | | | X | X | X | X | X | X | X | | | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | | | | | | X | | | | | | | X | | | | | X | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
|
outeiral/VOZIA | ---
license: openrail
---
|
CyberHarem/akagi_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of akagi/赤城/赤城 (Kantai Collection)
This is the dataset of akagi/赤城/赤城 (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `long_hair, brown_hair, brown_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 493.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akagi_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 323.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akagi_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1090 | 628.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akagi_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 451.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akagi_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1090 | 824.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/akagi_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/akagi_kantaicollection',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be discoverable here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, arrow_(projectile), bow_(weapon), muneate, solo, quiver, tasuki, white_thighhighs, yugake, hakama_short_skirt, single_glove, flight_deck, looking_at_viewer, white_background, smile |
| 1 | 7 |  |  |  |  |  | 1girl, arrow_(projectile), flight_deck, hakama_short_skirt, holding_bow_(weapon), muneate, quiver, single_glove, solo, straight_hair, tasuki, yugake, brown_gloves, red_hakama, white_background, looking_at_viewer, simple_background, smile, red_skirt, thighhighs |
| 2 | 6 |  |  |  |  |  | 1girl, arrow_(projectile), bow_(weapon), japanese_clothes, muneate, skirt, solo, white_thighhighs, yugake, smile, flight_deck, tasuki, white_background, looking_at_viewer, quiver |
| 3 | 10 |  |  |  |  |  | 1girl, aiming, drawing_bow, muneate, single_glove, solo, yugake, tasuki, kyuudou, outstretched_arm, holding_arrow, hakama_short_skirt, quiver, flight_deck |
| 4 | 5 |  |  |  |  |  | 1girl, hakama_short_skirt, looking_at_viewer, muneate, simple_background, smile, solo, straight_hair, tasuki, white_background, cowboy_shot, red_hakama, twitter_username |
| 5 | 6 |  |  |  |  |  | 1girl, japanese_clothes, muneate, simple_background, solo, tasuki, upper_body, white_background, looking_at_viewer, smile, blush |
| 6 | 15 |  |  |  |  |  | 1girl, japanese_clothes, muneate, solo, chopsticks, rice_on_face, eating, looking_at_viewer, rice_bowl, smile, blush |
| 7 | 6 |  |  |  |  |  | 1girl, food, looking_at_viewer, muneate, rice_bowl, solo, chopsticks, white_thighhighs, eating, hakama_skirt |
| 8 | 12 |  |  |  |  |  | 1girl, solo, alternate_costume, black_serafuku, looking_at_viewer, pleated_skirt, white_neckerchief, black_skirt, smile, straight_hair, white_background, black_sailor_collar, cowboy_shot, short_sleeves, simple_background, black_shirt |
| 9 | 6 |  |  |  |  |  | 1girl, alternate_costume, red_kimono, solo, blush, hair_flower, looking_at_viewer, floral_print, obi, smile, hair_between_eyes, open_mouth, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | arrow_(projectile) | bow_(weapon) | muneate | solo | quiver | tasuki | white_thighhighs | yugake | hakama_short_skirt | single_glove | flight_deck | looking_at_viewer | white_background | smile | holding_bow_(weapon) | straight_hair | brown_gloves | red_hakama | simple_background | red_skirt | thighhighs | japanese_clothes | skirt | aiming | drawing_bow | kyuudou | outstretched_arm | holding_arrow | cowboy_shot | twitter_username | upper_body | blush | chopsticks | rice_on_face | eating | rice_bowl | food | hakama_skirt | alternate_costume | black_serafuku | pleated_skirt | white_neckerchief | black_skirt | black_sailor_collar | short_sleeves | black_shirt | red_kimono | hair_flower | floral_print | obi | hair_between_eyes | open_mouth |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------------|:---------------|:----------|:-------|:---------|:---------|:-------------------|:---------|:---------------------|:---------------|:--------------|:--------------------|:-------------------|:--------|:-----------------------|:----------------|:---------------|:-------------|:--------------------|:------------|:-------------|:-------------------|:--------|:---------|:--------------|:----------|:-------------------|:----------------|:--------------|:-------------------|:-------------|:--------|:-------------|:---------------|:---------|:------------|:-------|:---------------|:--------------------|:-----------------|:----------------|:--------------------|:--------------|:----------------------|:----------------|:--------------|:-------------|:--------------|:---------------|:------|:--------------------|:-------------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | X | X | X | X | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 10 |  |  |  |  |  | X | | | X | X | X | X | | X | X | X | X | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | | X | X | | X | | | X | | | X | X | X | | X | | X | X | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | | | X | X | | X | | | | | | X | X | X | | | | | X | | | X | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | |
| 6 | 15 |  |  |  |  |  | X | | | X | X | | | | | | | | X | | X | | | | | | | | X | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | | | X | X | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | X | | X | X | X | X | | | | | | | | | | | | | | |
| 8 | 12 |  |  |  |  |  | X | | | | X | | | | | | | | X | X | X | | X | | | X | | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | |
| 9 | 6 |  |  |  |  |  | X | | | | X | | | | | | | | X | | X | | | | | | | | | | | | | | | | | X | X | | | | | | | X | | | | | | | | X | X | X | X | X | X |
|
open-llm-leaderboard/details_Eric111__Mistral-7B-Instruct-v0.2_openchat-3.5-0106 | ---
pretty_name: Evaluation run of Eric111/Mistral-7B-Instruct-v0.2_openchat-3.5-0106
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Eric111/Mistral-7B-Instruct-v0.2_openchat-3.5-0106](https://huggingface.co/Eric111/Mistral-7B-Instruct-v0.2_openchat-3.5-0106)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Eric111__Mistral-7B-Instruct-v0.2_openchat-3.5-0106\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-07T10:42:52.356363](https://huggingface.co/datasets/open-llm-leaderboard/details_Eric111__Mistral-7B-Instruct-v0.2_openchat-3.5-0106/blob/main/results_2024-03-07T10-42-52.356363.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6332641326135611,\n\
\ \"acc_stderr\": 0.03255277208732769,\n \"acc_norm\": 0.6363402762817489,\n\
\ \"acc_norm_stderr\": 0.0332042573641979,\n \"mc1\": 0.4222766217870257,\n\
\ \"mc1_stderr\": 0.017290733254248174,\n \"mc2\": 0.5888869957680887,\n\
\ \"mc2_stderr\": 0.015467603841641853\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6092150170648464,\n \"acc_stderr\": 0.014258563880513778,\n\
\ \"acc_norm\": 0.6569965870307167,\n \"acc_norm_stderr\": 0.013872423223718164\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6448914558852819,\n\
\ \"acc_stderr\": 0.004775681871529862,\n \"acc_norm\": 0.8458474407488548,\n\
\ \"acc_norm_stderr\": 0.0036035695286784127\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.04615186962583703,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.04615186962583703\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266237,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266237\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6741935483870968,\n\
\ \"acc_stderr\": 0.026662010578567104,\n \"acc_norm\": 0.6741935483870968,\n\
\ \"acc_norm_stderr\": 0.026662010578567104\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.035145285621750094,\n\
\ \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.035145285621750094\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.030874145136562076,\n\
\ \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.030874145136562076\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494562,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494562\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6256410256410256,\n \"acc_stderr\": 0.024537591572830503,\n\
\ \"acc_norm\": 0.6256410256410256,\n \"acc_norm_stderr\": 0.024537591572830503\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616258,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616258\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612896,\n \"\
acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612896\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.02732547096671631,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.02732547096671631\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097652,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097652\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.04742762361243011,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.04742762361243011\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8122605363984674,\n\
\ \"acc_stderr\": 0.013964393769899133,\n \"acc_norm\": 0.8122605363984674,\n\
\ \"acc_norm_stderr\": 0.013964393769899133\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n\
\ \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3653631284916201,\n\
\ \"acc_stderr\": 0.01610483388014229,\n \"acc_norm\": 0.3653631284916201,\n\
\ \"acc_norm_stderr\": 0.01610483388014229\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.02495418432487991,\n\
\ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.02495418432487991\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.026236965881153266,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.026236965881153266\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.025006469755799215,\n\
\ \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.025006469755799215\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46088657105606257,\n\
\ \"acc_stderr\": 0.012731102790504515,\n \"acc_norm\": 0.46088657105606257,\n\
\ \"acc_norm_stderr\": 0.012731102790504515\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n\
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6965174129353234,\n\
\ \"acc_stderr\": 0.03251006816458618,\n \"acc_norm\": 0.6965174129353234,\n\
\ \"acc_norm_stderr\": 0.03251006816458618\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653693,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653693\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"\
acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4222766217870257,\n\
\ \"mc1_stderr\": 0.017290733254248174,\n \"mc2\": 0.5888869957680887,\n\
\ \"mc2_stderr\": 0.015467603841641853\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7932123125493291,\n \"acc_stderr\": 0.011382566829235803\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5405610310841547,\n \
\ \"acc_stderr\": 0.013727093010429785\n }\n}\n```"
repo_url: https://huggingface.co/Eric111/Mistral-7B-Instruct-v0.2_openchat-3.5-0106
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|arc:challenge|25_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|gsm8k|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hellaswag|10_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T10-42-52.356363.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-07T10-42-52.356363.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- '**/details_harness|winogrande|5_2024-03-07T10-42-52.356363.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-07T10-42-52.356363.parquet'
- config_name: results
data_files:
- split: 2024_03_07T10_42_52.356363
path:
- results_2024-03-07T10-42-52.356363.parquet
- split: latest
path:
- results_2024-03-07T10-42-52.356363.parquet
---
# Dataset Card for Evaluation run of Eric111/Mistral-7B-Instruct-v0.2_openchat-3.5-0106
Dataset automatically created during the evaluation run of model [Eric111/Mistral-7B-Instruct-v0.2_openchat-3.5-0106](https://huggingface.co/Eric111/Mistral-7B-Instruct-v0.2_openchat-3.5-0106) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Eric111__Mistral-7B-Instruct-v0.2_openchat-3.5-0106",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-07T10:42:52.356363](https://huggingface.co/datasets/open-llm-leaderboard/details_Eric111__Mistral-7B-Instruct-v0.2_openchat-3.5-0106/blob/main/results_2024-03-07T10-42-52.356363.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6332641326135611,
"acc_stderr": 0.03255277208732769,
"acc_norm": 0.6363402762817489,
"acc_norm_stderr": 0.0332042573641979,
"mc1": 0.4222766217870257,
"mc1_stderr": 0.017290733254248174,
"mc2": 0.5888869957680887,
"mc2_stderr": 0.015467603841641853
},
"harness|arc:challenge|25": {
"acc": 0.6092150170648464,
"acc_stderr": 0.014258563880513778,
"acc_norm": 0.6569965870307167,
"acc_norm_stderr": 0.013872423223718164
},
"harness|hellaswag|10": {
"acc": 0.6448914558852819,
"acc_stderr": 0.004775681871529862,
"acc_norm": 0.8458474407488548,
"acc_norm_stderr": 0.0036035695286784127
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.04615186962583703,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.04615186962583703
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266237,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266237
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.02548718714785938,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.02548718714785938
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6741935483870968,
"acc_stderr": 0.026662010578567104,
"acc_norm": 0.6741935483870968,
"acc_norm_stderr": 0.026662010578567104
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.035145285621750094,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.035145285621750094
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.030874145136562076,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.030874145136562076
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494562,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494562
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6256410256410256,
"acc_stderr": 0.024537591572830503,
"acc_norm": 0.6256410256410256,
"acc_norm_stderr": 0.024537591572830503
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616258,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616258
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.016399436366612896,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.016399436366612896
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.02732547096671631,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.02732547096671631
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389094,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389094
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097652,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097652
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.04742762361243011,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.04742762361243011
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8122605363984674,
"acc_stderr": 0.013964393769899133,
"acc_norm": 0.8122605363984674,
"acc_norm_stderr": 0.013964393769899133
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.024818350129436593,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.024818350129436593
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3653631284916201,
"acc_stderr": 0.01610483388014229,
"acc_norm": 0.3653631284916201,
"acc_norm_stderr": 0.01610483388014229
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153266,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.025006469755799215,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.025006469755799215
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46088657105606257,
"acc_stderr": 0.012731102790504515,
"acc_norm": 0.46088657105606257,
"acc_norm_stderr": 0.012731102790504515
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6965174129353234,
"acc_stderr": 0.03251006816458618,
"acc_norm": 0.6965174129353234,
"acc_norm_stderr": 0.03251006816458618
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653693,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653693
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4222766217870257,
"mc1_stderr": 0.017290733254248174,
"mc2": 0.5888869957680887,
"mc2_stderr": 0.015467603841641853
},
"harness|winogrande|5": {
"acc": 0.7932123125493291,
"acc_stderr": 0.011382566829235803
},
"harness|gsm8k|5": {
"acc": 0.5405610310841547,
"acc_stderr": 0.013727093010429785
}
}
```
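For readers who want to aggregate the raw JSON themselves, here is a minimal sketch of averaging the per-task `acc` values while skipping the precomputed `"all"` entry. This is an illustration only, not part of the leaderboard tooling, and `mean_acc` is a hypothetical helper name.

```python
# Illustrative sketch: given a results dict shaped like the JSON above,
# compute the mean "acc" across per-task entries, skipping the "all"
# aggregate and any task that only reports other metrics (e.g. mc1/mc2).
def mean_acc(results):
    accs = [
        v["acc"]
        for k, v in results.items()
        if k != "all" and "acc" in v
    ]
    return sum(accs) / len(accs)

sample = {
    "all": {"acc": 0.5},
    "harness|winogrande|5": {"acc": 0.7932123125493291},
    "harness|gsm8k|5": {"acc": 0.5405610310841547},
    "harness|truthfulqa:mc|0": {"mc1": 0.4222766217870257},
}
print(round(mean_acc(sample), 4))  # → 0.6669
```

Note that the leaderboard's own aggregation may weight or group tasks differently; this only shows the shape of the data.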
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
bigscience-data/roots_indic-or_wiktionary | ---
language: or
license: cc-by-sa-3.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
|
Utkarsh55/utkarsh-llama2-profiles | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 76234
num_examples: 39
download_size: 40947
dataset_size: 76234
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Francesco/cable-damage | ---
dataset_info:
features:
- name: image_id
dtype: int64
- name: image
dtype: image
- name: width
dtype: int32
- name: height
dtype: int32
- name: objects
sequence:
- name: id
dtype: int64
- name: area
dtype: int64
- name: bbox
sequence: float32
length: 4
- name: category
dtype:
class_label:
names:
'0': cable-damage
'1': break
'2': thunderbolt
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- cc
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- object-detection
task_ids: []
pretty_name: cable-damage
tags:
- rf100
---
# Dataset Card for cable-damage
**The original COCO dataset is stored at `dataset.tar.gz`**
## Dataset Description
- **Homepage:** https://universe.roboflow.com/object-detection/cable-damage
- **Point of Contact:** francesco.zuppichini@gmail.com
### Dataset Summary
cable-damage
### Supported Tasks and Leaderboards
- `object-detection`: The dataset can be used to train a model for Object Detection.
### Languages
English
## Dataset Structure
### Data Instances
A data point comprises an image and its object annotations.
```
{
'image_id': 15,
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=640x640 at 0x2373B065C18>,
'width': 964043,
'height': 640,
'objects': {
'id': [114, 115, 116, 117],
'area': [3796, 1596, 152768, 81002],
'bbox': [
[302.0, 109.0, 73.0, 52.0],
[810.0, 100.0, 57.0, 28.0],
[160.0, 31.0, 248.0, 616.0],
[741.0, 68.0, 202.0, 401.0]
],
'category': [4, 4, 0, 0]
}
}
```
### Data Fields
- `image_id`: the image id
- `image`: `PIL.Image.Image` object containing the image. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`
- `width`: the image width
- `height`: the image height
- `objects`: a dictionary containing bounding box metadata for the objects present on the image
- `id`: the annotation id
- `area`: the area of the bounding box
- `bbox`: the object's bounding box (in the [coco](https://albumentations.ai/docs/getting_started/bounding_boxes_augmentation/#coco) format)
- `category`: the object's category.
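Since the `bbox` field uses the COCO `[x_min, y_min, width, height]` convention, a minimal sketch of converting it to corner coordinates (which many detection libraries expect) may help; the helper name is hypothetical and not part of this dataset's tooling.

```python
# Illustrative helper: convert a COCO-format bbox
# [x_min, y_min, width, height] into corner coordinates
# [x_min, y_min, x_max, y_max].
def coco_to_corners(bbox):
    x, y, w, h = bbox
    return [x, y, x + w, y + h]

# First bbox from the sample data instance above:
print(coco_to_corners([302.0, 109.0, 73.0, 52.0]))  # → [302.0, 109.0, 375.0, 161.0]
```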
#### Who are the annotators?
Annotators are Roboflow users
## Additional Information
### Licensing Information
See original homepage https://universe.roboflow.com/object-detection/cable-damage
### Citation Information
```
@misc{ cable-damage,
title = { cable damage Dataset },
type = { Open Source Dataset },
author = { Roboflow 100 },
howpublished = { \url{ https://universe.roboflow.com/object-detection/cable-damage } },
url = { https://universe.roboflow.com/object-detection/cable-damage },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2022 },
month = { nov },
note = { visited on 2023-03-29 },
}
```
### Contributions
Thanks to [@mariosasko](https://github.com/mariosasko) for adding this dataset. |
p1atdev/fake-news-jp | ---
license: cc-by-2.5
language:
- ja
size_categories:
- 10K<n<100K
---
# Japanese Fake News Dataset
The [Japanese Fakenews Dataset](https://github.com/tanreinama/Japanese-Fakenews-Dataset) converted for use with HuggingFace datasets.
## Labels
- id: unique ID
- context: body text
- fake_type: `real` if genuine, `partial_gpt2` if AI-generated (GPT-2) from some point onward, `full_gpt2` if entirely GPT-2-generated
- nchar_real: number of characters in the genuine portion
- nchar_fake: number of characters in the fake portion
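One natural derived quantity is the fraction of generated text per example, computed from the `nchar_real` and `nchar_fake` character counts. This is a sketch of a possible downstream use, not something shipped with the dataset, and `fake_ratio` is a hypothetical helper.

```python
# Illustrative sketch: fraction of an example's characters that are
# AI-generated, from its nchar_real / nchar_fake fields.
def fake_ratio(example):
    total = example["nchar_real"] + example["nchar_fake"]
    return example["nchar_fake"] / total if total else 0.0

print(fake_ratio({"nchar_real": 300, "nchar_fake": 100}))  # → 0.25
```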
|
mtc/multirc_sample_questions | ---
dataset_info:
features:
- name: document
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 75668
num_examples: 222
download_size: 38253
dataset_size: 75668
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|