| datasetId | card |
|---|---|
Foxes/image | ---
license: other
--- |
open-llm-leaderboard/details_KaeriJenti__kaori-70b-v1 | ---
pretty_name: Evaluation run of KaeriJenti/kaori-70b-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KaeriJenti/kaori-70b-v1](https://huggingface.co/KaeriJenti/kaori-70b-v1) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KaeriJenti__kaori-70b-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-08T14:48:24.732982](https://huggingface.co/datasets/open-llm-leaderboard/details_KaeriJenti__kaori-70b-v1/blob/main/results_2023-12-08T14-48-24.732982.json) (note
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7060605188404532,\n\
\ \"acc_stderr\": 0.03018324713174517,\n \"acc_norm\": 0.710861868068879,\n\
\ \"acc_norm_stderr\": 0.030768133121540496,\n \"mc1\": 0.4149326805385557,\n\
\ \"mc1_stderr\": 0.017248314465805978,\n \"mc2\": 0.5881075564330039,\n\
\ \"mc2_stderr\": 0.014883748546839335\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6544368600682594,\n \"acc_stderr\": 0.013896938461145675,\n\
\ \"acc_norm\": 0.6979522184300341,\n \"acc_norm_stderr\": 0.013417519144716413\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6815375423222466,\n\
\ \"acc_stderr\": 0.004649278153073816,\n \"acc_norm\": 0.8736307508464449,\n\
\ \"acc_norm_stderr\": 0.0033158599188575543\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8157894736842105,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.8157894736842105,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n\
\ \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7320754716981132,\n \"acc_stderr\": 0.027257260322494845,\n\
\ \"acc_norm\": 0.7320754716981132,\n \"acc_norm_stderr\": 0.027257260322494845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.031164899666948617,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.031164899666948617\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6936170212765957,\n \"acc_stderr\": 0.03013590647851756,\n\
\ \"acc_norm\": 0.6936170212765957,\n \"acc_norm_stderr\": 0.03013590647851756\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6689655172413793,\n \"acc_stderr\": 0.03921545312467122,\n\
\ \"acc_norm\": 0.6689655172413793,\n \"acc_norm_stderr\": 0.03921545312467122\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.46825396825396826,\n \"acc_stderr\": 0.0256993528321318,\n \"\
acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.0256993528321318\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5158730158730159,\n\
\ \"acc_stderr\": 0.044698818540726076,\n \"acc_norm\": 0.5158730158730159,\n\
\ \"acc_norm_stderr\": 0.044698818540726076\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8387096774193549,\n\
\ \"acc_stderr\": 0.0209233270064233,\n \"acc_norm\": 0.8387096774193549,\n\
\ \"acc_norm_stderr\": 0.0209233270064233\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\"\
: 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781668,\n\
\ \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781668\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8939393939393939,\n \"acc_stderr\": 0.02193804773885312,\n \"\
acc_norm\": 0.8939393939393939,\n \"acc_norm_stderr\": 0.02193804773885312\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9326424870466321,\n \"acc_stderr\": 0.0180883938390789,\n\
\ \"acc_norm\": 0.9326424870466321,\n \"acc_norm_stderr\": 0.0180883938390789\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7102564102564103,\n \"acc_stderr\": 0.023000628243687975,\n\
\ \"acc_norm\": 0.7102564102564103,\n \"acc_norm_stderr\": 0.023000628243687975\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652458,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652458\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.773109243697479,\n \"acc_stderr\": 0.027205371538279476,\n \
\ \"acc_norm\": 0.773109243697479,\n \"acc_norm_stderr\": 0.027205371538279476\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4768211920529801,\n \"acc_stderr\": 0.04078093859163083,\n \"\
acc_norm\": 0.4768211920529801,\n \"acc_norm_stderr\": 0.04078093859163083\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8972477064220183,\n \"acc_stderr\": 0.01301824650917377,\n \"\
acc_norm\": 0.8972477064220183,\n \"acc_norm_stderr\": 0.01301824650917377\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6435185185185185,\n \"acc_stderr\": 0.032664783315272714,\n \"\
acc_norm\": 0.6435185185185185,\n \"acc_norm_stderr\": 0.032664783315272714\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9068627450980392,\n \"acc_stderr\": 0.020397853969426987,\n \"\
acc_norm\": 0.9068627450980392,\n \"acc_norm_stderr\": 0.020397853969426987\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8945147679324894,\n \"acc_stderr\": 0.01999556072375854,\n \
\ \"acc_norm\": 0.8945147679324894,\n \"acc_norm_stderr\": 0.01999556072375854\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7713004484304933,\n\
\ \"acc_stderr\": 0.028188240046929203,\n \"acc_norm\": 0.7713004484304933,\n\
\ \"acc_norm_stderr\": 0.028188240046929203\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n\
\ \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8677685950413223,\n \"acc_stderr\": 0.030922788320445784,\n \"\
acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.030922788320445784\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.03602814176392645,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.03602814176392645\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8343558282208589,\n \"acc_stderr\": 0.029208296231259104,\n\
\ \"acc_norm\": 0.8343558282208589,\n \"acc_norm_stderr\": 0.029208296231259104\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5446428571428571,\n\
\ \"acc_stderr\": 0.04726835553719098,\n \"acc_norm\": 0.5446428571428571,\n\
\ \"acc_norm_stderr\": 0.04726835553719098\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.03675668832233188,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.03675668832233188\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n\
\ \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n\
\ \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8646232439335888,\n\
\ \"acc_stderr\": 0.012234384586856491,\n \"acc_norm\": 0.8646232439335888,\n\
\ \"acc_norm_stderr\": 0.012234384586856491\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7803468208092486,\n \"acc_stderr\": 0.022289638852617887,\n\
\ \"acc_norm\": 0.7803468208092486,\n \"acc_norm_stderr\": 0.022289638852617887\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.535195530726257,\n\
\ \"acc_stderr\": 0.01668102093107665,\n \"acc_norm\": 0.535195530726257,\n\
\ \"acc_norm_stderr\": 0.01668102093107665\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7813504823151125,\n\
\ \"acc_stderr\": 0.02347558141786111,\n \"acc_norm\": 0.7813504823151125,\n\
\ \"acc_norm_stderr\": 0.02347558141786111\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8302469135802469,\n \"acc_stderr\": 0.02088869041409387,\n\
\ \"acc_norm\": 0.8302469135802469,\n \"acc_norm_stderr\": 0.02088869041409387\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5780141843971631,\n \"acc_stderr\": 0.029462189233370586,\n \
\ \"acc_norm\": 0.5780141843971631,\n \"acc_norm_stderr\": 0.029462189233370586\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5619295958279009,\n\
\ \"acc_stderr\": 0.012671902782567638,\n \"acc_norm\": 0.5619295958279009,\n\
\ \"acc_norm_stderr\": 0.012671902782567638\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7610294117647058,\n \"acc_stderr\": 0.02590528064489301,\n\
\ \"acc_norm\": 0.7610294117647058,\n \"acc_norm_stderr\": 0.02590528064489301\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7549019607843137,\n \"acc_stderr\": 0.01740181671142765,\n \
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.01740181671142765\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n\
\ \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n\
\ \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8040816326530612,\n \"acc_stderr\": 0.025409301953225678,\n\
\ \"acc_norm\": 0.8040816326530612,\n \"acc_norm_stderr\": 0.025409301953225678\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.025196929874827075,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.025196929874827075\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466108,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466108\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n\
\ \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4149326805385557,\n\
\ \"mc1_stderr\": 0.017248314465805978,\n \"mc2\": 0.5881075564330039,\n\
\ \"mc2_stderr\": 0.014883748546839335\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.840568271507498,\n \"acc_stderr\": 0.010288617479454764\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5238817285822593,\n \
\ \"acc_stderr\": 0.013756765835465755\n }\n}\n```"
repo_url: https://huggingface.co/KaeriJenti/kaori-70b-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|arc:challenge|25_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|gsm8k|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hellaswag|10_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-08T14-48-24.732982.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-08T14-48-24.732982.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- '**/details_harness|winogrande|5_2023-12-08T14-48-24.732982.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-08T14-48-24.732982.parquet'
- config_name: results
data_files:
- split: 2023_12_08T14_48_24.732982
path:
- results_2023-12-08T14-48-24.732982.parquet
- split: latest
path:
- results_2023-12-08T14-48-24.732982.parquet
---
# Dataset Card for Evaluation run of KaeriJenti/kaori-70b-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KaeriJenti/kaori-70b-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KaeriJenti/kaori-70b-v1](https://huggingface.co/KaeriJenti/kaori-70b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KaeriJenti__kaori-70b-v1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-08T14:48:24.732982](https://huggingface.co/datasets/open-llm-leaderboard/details_KaeriJenti__kaori-70b-v1/blob/main/results_2023-12-08T14-48-24.732982.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7060605188404532,
"acc_stderr": 0.03018324713174517,
"acc_norm": 0.710861868068879,
"acc_norm_stderr": 0.030768133121540496,
"mc1": 0.4149326805385557,
"mc1_stderr": 0.017248314465805978,
"mc2": 0.5881075564330039,
"mc2_stderr": 0.014883748546839335
},
"harness|arc:challenge|25": {
"acc": 0.6544368600682594,
"acc_stderr": 0.013896938461145675,
"acc_norm": 0.6979522184300341,
"acc_norm_stderr": 0.013417519144716413
},
"harness|hellaswag|10": {
"acc": 0.6815375423222466,
"acc_stderr": 0.004649278153073816,
"acc_norm": 0.8736307508464449,
"acc_norm_stderr": 0.0033158599188575543
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8157894736842105,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.8157894736842105,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7320754716981132,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.7320754716981132,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.031164899666948617,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.031164899666948617
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6936170212765957,
"acc_stderr": 0.03013590647851756,
"acc_norm": 0.6936170212765957,
"acc_norm_stderr": 0.03013590647851756
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6689655172413793,
"acc_stderr": 0.03921545312467122,
"acc_norm": 0.6689655172413793,
"acc_norm_stderr": 0.03921545312467122
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.0256993528321318,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.0256993528321318
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5158730158730159,
"acc_stderr": 0.044698818540726076,
"acc_norm": 0.5158730158730159,
"acc_norm_stderr": 0.044698818540726076
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8387096774193549,
"acc_stderr": 0.0209233270064233,
"acc_norm": 0.8387096774193549,
"acc_norm_stderr": 0.0209233270064233
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.541871921182266,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.541871921182266,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781668,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781668
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8939393939393939,
"acc_stderr": 0.02193804773885312,
"acc_norm": 0.8939393939393939,
"acc_norm_stderr": 0.02193804773885312
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9326424870466321,
"acc_stderr": 0.0180883938390789,
"acc_norm": 0.9326424870466321,
"acc_norm_stderr": 0.0180883938390789
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7102564102564103,
"acc_stderr": 0.023000628243687975,
"acc_norm": 0.7102564102564103,
"acc_norm_stderr": 0.023000628243687975
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652458,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652458
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.773109243697479,
"acc_stderr": 0.027205371538279476,
"acc_norm": 0.773109243697479,
"acc_norm_stderr": 0.027205371538279476
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4768211920529801,
"acc_stderr": 0.04078093859163083,
"acc_norm": 0.4768211920529801,
"acc_norm_stderr": 0.04078093859163083
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8972477064220183,
"acc_stderr": 0.01301824650917377,
"acc_norm": 0.8972477064220183,
"acc_norm_stderr": 0.01301824650917377
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6435185185185185,
"acc_stderr": 0.032664783315272714,
"acc_norm": 0.6435185185185185,
"acc_norm_stderr": 0.032664783315272714
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9068627450980392,
"acc_stderr": 0.020397853969426987,
"acc_norm": 0.9068627450980392,
"acc_norm_stderr": 0.020397853969426987
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8945147679324894,
"acc_stderr": 0.01999556072375854,
"acc_norm": 0.8945147679324894,
"acc_norm_stderr": 0.01999556072375854
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7713004484304933,
"acc_stderr": 0.028188240046929203,
"acc_norm": 0.7713004484304933,
"acc_norm_stderr": 0.028188240046929203
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.030922788320445784,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.030922788320445784
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.03602814176392645,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.03602814176392645
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8343558282208589,
"acc_stderr": 0.029208296231259104,
"acc_norm": 0.8343558282208589,
"acc_norm_stderr": 0.029208296231259104
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5446428571428571,
"acc_stderr": 0.04726835553719098,
"acc_norm": 0.5446428571428571,
"acc_norm_stderr": 0.04726835553719098
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.03675668832233188,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.03675668832233188
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8646232439335888,
"acc_stderr": 0.012234384586856491,
"acc_norm": 0.8646232439335888,
"acc_norm_stderr": 0.012234384586856491
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7803468208092486,
"acc_stderr": 0.022289638852617887,
"acc_norm": 0.7803468208092486,
"acc_norm_stderr": 0.022289638852617887
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.535195530726257,
"acc_stderr": 0.01668102093107665,
"acc_norm": 0.535195530726257,
"acc_norm_stderr": 0.01668102093107665
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7813504823151125,
"acc_stderr": 0.02347558141786111,
"acc_norm": 0.7813504823151125,
"acc_norm_stderr": 0.02347558141786111
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8302469135802469,
"acc_stderr": 0.02088869041409387,
"acc_norm": 0.8302469135802469,
"acc_norm_stderr": 0.02088869041409387
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5780141843971631,
"acc_stderr": 0.029462189233370586,
"acc_norm": 0.5780141843971631,
"acc_norm_stderr": 0.029462189233370586
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5619295958279009,
"acc_stderr": 0.012671902782567638,
"acc_norm": 0.5619295958279009,
"acc_norm_stderr": 0.012671902782567638
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7610294117647058,
"acc_stderr": 0.02590528064489301,
"acc_norm": 0.7610294117647058,
"acc_norm_stderr": 0.02590528064489301
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.01740181671142765,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.01740181671142765
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8040816326530612,
"acc_stderr": 0.025409301953225678,
"acc_norm": 0.8040816326530612,
"acc_norm_stderr": 0.025409301953225678
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827075,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827075
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466108,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466108
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.026168221344662297,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.026168221344662297
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4149326805385557,
"mc1_stderr": 0.017248314465805978,
"mc2": 0.5881075564330039,
"mc2_stderr": 0.014883748546839335
},
"harness|winogrande|5": {
"acc": 0.840568271507498,
"acc_stderr": 0.010288617479454764
},
"harness|gsm8k|5": {
"acc": 0.5238817285822593,
"acc_stderr": 0.013756765835465755
}
}
```
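As a quick sanity check on the numbers above, the per-task dictionary can be sliced to rank the weakest MMLU subtasks. The sketch below hard-codes a few of the `acc` values reported in the JSON above (the full dict has the same shape); the key-parsing logic is an illustration, not part of the evaluation harness:

```python
# A small subset of the results reported above, keyed exactly as in the JSON.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.32},
    "harness|hendrycksTest-high_school_mathematics|5": {"acc": 0.3296296296296296},
    "harness|hendrycksTest-college_mathematics|5": {"acc": 0.4},
    "harness|hendrycksTest-us_foreign_policy|5": {"acc": 0.91},
    "harness|winogrande|5": {"acc": 0.840568271507498},
}

# Keep only MMLU (hendrycksTest) subtasks, stripping the
# "harness|hendrycksTest-" prefix and the "|5" few-shot suffix.
mmlu = {
    key.split("-", 1)[1].split("|")[0]: value["acc"]
    for key, value in results.items()
    if "hendrycksTest" in key
}

weakest = sorted(mmlu, key=mmlu.get)[:2]
print(weakest)  # → ['abstract_algebra', 'high_school_mathematics']
```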
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n\
\ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n\
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652457,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652457\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135363,\n\
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135363\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163227,\n \"\
acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163227\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"\
acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621133,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621133\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.02335736578587403,\n\
\ \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.02335736578587403\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4245810055865922,\n\
\ \"acc_stderr\": 0.016531170993278888,\n \"acc_norm\": 0.4245810055865922,\n\
\ \"acc_norm_stderr\": 0.016531170993278888\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.025917806117147158,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.025917806117147158\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.02465968518596728,\n\
\ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.02465968518596728\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46284224250325945,\n\
\ \"acc_stderr\": 0.012734923579532067,\n \"acc_norm\": 0.46284224250325945,\n\
\ \"acc_norm_stderr\": 0.012734923579532067\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6009791921664627,\n\
\ \"mc1_stderr\": 0.017142825728496763,\n \"mc2\": 0.7451469185274574,\n\
\ \"mc2_stderr\": 0.01432046447721798\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8326756116811366,\n \"acc_stderr\": 0.010490608806828077\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.730098559514784,\n \
\ \"acc_stderr\": 0.012227442856468897\n }\n}\n```"
repo_url: https://huggingface.co/KaeriJenti/kaori-70b-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|arc:challenge|25_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|gsm8k|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hellaswag|10_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T19-42-27.784543.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-02T19-42-27.784543.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- '**/details_harness|winogrande|5_2024-04-02T19-42-27.784543.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-02T19-42-27.784543.parquet'
- config_name: results
data_files:
- split: 2024_04_02T19_42_27.784543
path:
- results_2024-04-02T19-42-27.784543.parquet
- split: latest
path:
- results_2024-04-02T19-42-27.784543.parquet
---
# Dataset Card for Evaluation run of louisbrulenaudet/Maxine-7B-0401-ties
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [louisbrulenaudet/Maxine-7B-0401-ties](https://huggingface.co/louisbrulenaudet/Maxine-7B-0401-ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_louisbrulenaudet__Maxine-7B-0401-ties",
"harness_winogrande_5",
split="train")
```
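Each run's split name is derived from its timestamp by replacing the `-` and `:` separators with `_`. A minimal sketch of that mapping (the helper name is illustrative, not part of the `datasets` library):

```python
def timestamp_to_split(ts: str) -> str:
    """Convert a run timestamp like '2024-04-02T19:42:27.784543'
    into its split name, '2024_04_02T19_42_27.784543'."""
    date, time = ts.split("T")
    return date.replace("-", "_") + "T" + time.replace(":", "_")

print(timestamp_to_split("2024-04-02T19:42:27.784543"))
# → 2024_04_02T19_42_27.784543
```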
## Latest results
These are the [latest results from run 2024-04-02T19:42:27.784543](https://huggingface.co/datasets/open-llm-leaderboard/details_louisbrulenaudet__Maxine-7B-0401-ties/blob/main/results_2024-04-02T19-42-27.784543.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6499764898890839,
"acc_stderr": 0.03211096827722946,
"acc_norm": 0.6488874998799237,
"acc_norm_stderr": 0.032785672680969914,
"mc1": 0.6009791921664627,
"mc1_stderr": 0.017142825728496763,
"mc2": 0.7451469185274574,
"mc2_stderr": 0.01432046447721798
},
"harness|arc:challenge|25": {
"acc": 0.6919795221843004,
"acc_stderr": 0.013491429517292037,
"acc_norm": 0.7175767918088737,
"acc_norm_stderr": 0.013155456884097225
},
"harness|hellaswag|10": {
"acc": 0.7153953395737901,
"acc_stderr": 0.004503037601847085,
"acc_norm": 0.8883688508265286,
"acc_norm_stderr": 0.0031426851645672597
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337124,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337124
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.02540255550326091,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.02540255550326091
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181015,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181015
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652457,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652457
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135363,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135363
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163227,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163227
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977748,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621133,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621133
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.02335736578587403,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.02335736578587403
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4245810055865922,
"acc_stderr": 0.016531170993278888,
"acc_norm": 0.4245810055865922,
"acc_norm_stderr": 0.016531170993278888
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.025917806117147158,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.025917806117147158
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.02465968518596728,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.02465968518596728
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46284224250325945,
"acc_stderr": 0.012734923579532067,
"acc_norm": 0.46284224250325945,
"acc_norm_stderr": 0.012734923579532067
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389845,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389845
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.02879518557429129,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.02879518557429129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896308,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896308
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6009791921664627,
"mc1_stderr": 0.017142825728496763,
"mc2": 0.7451469185274574,
"mc2_stderr": 0.01432046447721798
},
"harness|winogrande|5": {
"acc": 0.8326756116811366,
"acc_stderr": 0.010490608806828077
},
"harness|gsm8k|5": {
"acc": 0.730098559514784,
"acc_stderr": 0.012227442856468897
}
}
```
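Individual task metrics can be pulled out of a results dictionary shaped like the one above. A minimal sketch using a hypothetical two-task subset of the data:

```python
# A small subset of the aggregated results, keyed by "harness|task|n_shots".
results = {
    "harness|winogrande|5": {"acc": 0.8326756116811366,
                             "acc_stderr": 0.010490608806828077},
    "harness|gsm8k|5": {"acc": 0.730098559514784,
                        "acc_stderr": 0.012227442856468897},
}

def metric(task: str, name: str = "acc") -> float:
    """Look up one metric (e.g. 'acc' or 'acc_stderr') for one task."""
    return results[task][name]

print(round(metric("harness|gsm8k|5"), 4))
# → 0.7301
```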
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
d0rj/audiocaps | ---
dataset_info:
features:
- name: audiocap_id
dtype: int64
- name: youtube_id
dtype: string
- name: start_time
dtype: int64
- name: caption
dtype: string
splits:
- name: train
num_bytes: 4162928
num_examples: 49838
- name: validation
num_bytes: 198563
num_examples: 2475
- name: test
num_bytes: 454652
num_examples: 4875
download_size: 2781679
dataset_size: 4816143
license: mit
task_categories:
- text-to-speech
language:
- en
multilinguality:
- monolingual
tags:
- youtube
- captions
pretty_name: AudioCaps
size_categories:
- 10K<n<100K
source_datasets:
- original
paperswithcode_id: audiocaps
---
# audiocaps
## Dataset Description
- **Homepage:** https://audiocaps.github.io/
- **Repository:** https://github.com/cdjkim/audiocaps
- **Paper:** [AudioCaps: Generating Captions for Audios in The Wild](https://aclanthology.org/N19-1011.pdf)
HuggingFace mirror of [official data repo](https://github.com/cdjkim/audiocaps). |
bjoernp/wikipedia_sentence_level_en_de | ---
dataset_info:
features:
- name: sentences
dtype: string
- name: de_sentences
dtype: string
splits:
- name: train
num_bytes: 18768712971
num_examples: 27736968
download_size: 11340576833
dataset_size: 18768712971
---
# Dataset Card for "wikipedia_sentence_level_en_de"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/med_alpaca_standardized_cluster_75 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 24893948
num_examples: 2454
download_size: 7545979
dataset_size: 24893948
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_75"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ovior/twitter_dataset_1713172455 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2363827
num_examples: 6747
download_size: 1380441
dataset_size: 2363827
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
SahilSN/Dataset_v6 | ---
license: llama2
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 108497
num_examples: 503
download_size: 13792
dataset_size: 108497
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
argilla/10k_prompts_top_SPIN_iter0 | ---
dataset_info:
features:
- name: real
list:
- name: content
dtype: string
- name: role
dtype: string
- name: generated
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 7703827.877729258
num_examples: 1648
- name: test
num_bytes: 860136.1222707423
num_examples: 184
download_size: 4615457
dataset_size: 8563964.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_NousResearch__CodeLlama-34b-hf | ---
pretty_name: Evaluation run of NousResearch/CodeLlama-34b-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NousResearch/CodeLlama-34b-hf](https://huggingface.co/NousResearch/CodeLlama-34b-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NousResearch__CodeLlama-34b-hf\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-19T02:30:16.105719](https://huggingface.co/datasets/open-llm-leaderboard/details_NousResearch__CodeLlama-34b-hf/blob/main/results_2023-10-19T02-30-16.105719.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001572986577181208,\n\
\ \"em_stderr\": 0.000405845113241773,\n \"f1\": 0.053129194630872456,\n\
\ \"f1_stderr\": 0.0012479911431073588,\n \"acc\": 0.47504507313126765,\n\
\ \"acc_stderr\": 0.011877427321344947\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001572986577181208,\n \"em_stderr\": 0.000405845113241773,\n\
\ \"f1\": 0.053129194630872456,\n \"f1_stderr\": 0.0012479911431073588\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.21607278241091737,\n \
\ \"acc_stderr\": 0.011336531489638852\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.734017363851618,\n \"acc_stderr\": 0.012418323153051043\n\
\ }\n}\n```"
repo_url: https://huggingface.co/NousResearch/CodeLlama-34b-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|arc:challenge|25_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_19T02_30_16.105719
path:
- '**/details_harness|drop|3_2023-10-19T02-30-16.105719.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-19T02-30-16.105719.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_19T02_30_16.105719
path:
- '**/details_harness|gsm8k|5_2023-10-19T02-30-16.105719.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-19T02-30-16.105719.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hellaswag|10_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T04:48:17.440962.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T04:48:17.440962.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-26T04:48:17.440962.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_19T02_30_16.105719
path:
- '**/details_harness|winogrande|5_2023-10-19T02-30-16.105719.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-19T02-30-16.105719.parquet'
- config_name: results
data_files:
- split: 2023_08_26T04_48_17.440962
path:
- results_2023-08-26T04:48:17.440962.parquet
- split: 2023_10_19T02_30_16.105719
path:
- results_2023-10-19T02-30-16.105719.parquet
- split: latest
path:
- results_2023-10-19T02-30-16.105719.parquet
---
# Dataset Card for Evaluation run of NousResearch/CodeLlama-34b-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NousResearch/CodeLlama-34b-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [NousResearch/CodeLlama-34b-hf](https://huggingface.co/NousResearch/CodeLlama-34b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NousResearch__CodeLlama-34b-hf",
"harness_winogrande_5",
split="train")
```
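Each config name is derived from the harness task identifier shown in the results JSON (for example, `harness|winogrande|5` becomes `harness_winogrande_5`). A minimal helper illustrating this naming convention, inferred from the `configs` list above rather than taken from any official API:

```python
def task_to_config_name(task: str) -> str:
    """Map a harness task identifier (e.g. 'harness|arc:challenge|25')
    to the dataset config name used in this repository
    (e.g. 'harness_arc_challenge_25').

    This mirrors the convention visible in the configs list above:
    '|', ':', and '-' in the task name are all replaced with '_'.
    """
    return task.replace("|", "_").replace(":", "_").replace("-", "_")


print(task_to_config_name("harness|winogrande|5"))
# harness_winogrande_5
```

The resulting string can be passed as the second argument to `load_dataset` as in the example above.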
## Latest results
These are the [latest results from run 2023-10-19T02:30:16.105719](https://huggingface.co/datasets/open-llm-leaderboard/details_NousResearch__CodeLlama-34b-hf/blob/main/results_2023-10-19T02-30-16.105719.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001572986577181208,
"em_stderr": 0.000405845113241773,
"f1": 0.053129194630872456,
"f1_stderr": 0.0012479911431073588,
"acc": 0.47504507313126765,
"acc_stderr": 0.011877427321344947
},
"harness|drop|3": {
"em": 0.001572986577181208,
"em_stderr": 0.000405845113241773,
"f1": 0.053129194630872456,
"f1_stderr": 0.0012479911431073588
},
"harness|gsm8k|5": {
"acc": 0.21607278241091737,
"acc_stderr": 0.011336531489638852
},
"harness|winogrande|5": {
"acc": 0.734017363851618,
"acc_stderr": 0.012418323153051043
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ai4ce/CoVPR | ---
license: apache-2.0
---
|
JinglesDados/PretoJoia | ---
license: openrail
---
|
kaleemWaheed/twitter_dataset_1712991179 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 246797
num_examples: 645
download_size: 90767
dataset_size: 246797
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hippocrates/emrqaQA_medication_train | ---
dataset_info:
features:
- name: id
dtype: int64
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: text
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 24849580
num_examples: 59928
- name: valid
num_bytes: 4286042
num_examples: 10468
download_size: 0
dataset_size: 29135622
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
---
# Dataset Card for "emrqaQA_medication_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rezaakb/VizWiz-Classification | ---
license: mit
---
|
mu-llama/MusicQA | ---
license: mit
---
# MusicQA Dataset
This is the dataset used for training and testing the Music Understanding Large Language Model (MU-LLaMA). |
TheGreatRambler/mm2_world | ---
language:
- multilingual
license:
- cc-by-nc-sa-4.0
multilinguality:
- multilingual
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- other
- object-detection
- text-retrieval
- token-classification
- text-generation
task_ids: []
pretty_name: Mario Maker 2 super worlds
tags:
- text-mining
---
# Mario Maker 2 super worlds
Part of the [Mario Maker 2 Dataset Collection](https://tgrcode.com/posts/mario_maker_2_datasets)
## Dataset Description
The Mario Maker 2 super worlds dataset consists of 289 thousand super worlds from Nintendo's online service totaling around 13.5GB of data. The dataset was created using the self-hosted [Mario Maker 2 api](https://tgrcode.com/posts/mario_maker_2_api) over the course of 1 month in February 2022.
### How to use it
The Mario Maker 2 super worlds dataset is a very large dataset, so for most use cases it is recommended to make use of the streaming API of `datasets`. You can load and iterate through the dataset with the following code:
```python
from datasets import load_dataset
ds = load_dataset("TheGreatRambler/mm2_world", streaming=True, split="train")
print(next(iter(ds)))
#OUTPUT:
{
'pid': '14510618610706594411',
'world_id': 'c96012bef256ba6b_20200513204805563301',
'worlds': 1,
'levels': 5,
'planet_type': 0,
'created': 1589420886,
'unk1': [some binary data],
'unk5': 3,
'unk6': 1,
'unk7': 1,
'thumbnail': [some binary data]
}
```
Each row is a unique super world denoted by the `world_id` created by the player denoted by the `pid`. Thumbnails are binary PNGs. `unk1` describes the super world itself, including the world map, but its format is unknown as of now.
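Note that this card describes thumbnails as PNGs here but as JPEGs in the data-fields table; a quick magic-byte check (a hypothetical helper, not part of the dataset tooling) can tell which format a given row's bytes actually use:

```python
def image_format(data: bytes) -> str:
    """Identify an image payload by its leading magic bytes."""
    if data[:8] == b"\x89PNG\r\n\x1a\n":
        return "png"
    if data[:3] == b"\xff\xd8\xff":
        return "jpeg"
    return "unknown"

# e.g. image_format(row["thumbnail"]) for a row streamed as above
```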
You can also download the full dataset. Note that this will download ~13.5GB:
```python
ds = load_dataset("TheGreatRambler/mm2_world", split="train")
```
## Data Structure
### Data Instances
```python
{
'pid': '14510618610706594411',
'world_id': 'c96012bef256ba6b_20200513204805563301',
'worlds': 1,
'levels': 5,
'planet_type': 0,
'created': 1589420886,
'unk1': [some binary data],
'unk5': 3,
'unk6': 1,
'unk7': 1,
'thumbnail': [some binary data]
}
```
### Data Fields
|Field|Type|Description|
|---|---|---|
|pid|string|The player ID of the user who created this super world|
|world_id|string|World ID|
|worlds|int|Number of worlds|
|levels|int|Number of levels|
|planet_type|int|Planet type, enum below|
|created|int|UTC timestamp of when this super world was created|
|unk1|bytes|Unknown|
|unk5|int|Unknown|
|unk6|int|Unknown|
|unk7|int|Unknown|
|thumbnail|bytes|The thumbnail, as a JPEG binary|
|thumbnail_url|string|The old URL of this thumbnail|
|thumbnail_size|int|The filesize of this thumbnail|
|thumbnail_filename|string|The filename of this thumbnail|
### Data Splits
The dataset only contains a train split.
## Enums
The dataset contains some enum integer fields. This can be used to convert back to their string equivalents:
```python
SuperWorldPlanetType = {
0: "Earth",
1: "Moon",
2: "Sand",
3: "Green",
4: "Ice",
5: "Ringed",
6: "Red",
7: "Spiral"
}
```
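For example, the mapping can be wrapped in a small helper (a self-contained sketch, so the dict is repeated; the fallback value `"unknown"` is our own addition for values outside the documented enum):

```python
SuperWorldPlanetType = {
    0: "Earth", 1: "Moon", 2: "Sand", 3: "Green",
    4: "Ice", 5: "Ringed", 6: "Red", 7: "Spiral",
}

def planet_name(planet_type: int) -> str:
    # Fall back to "unknown" for values outside the documented enum
    return SuperWorldPlanetType.get(planet_type, "unknown")

# e.g. planet_name(row["planet_type"]) while iterating the dataset
```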
<!-- TODO create detailed statistics -->
## Dataset Creation
The dataset was created over a little more than a month in February 2022 using the self-hosted [Mario Maker 2 api](https://tgrcode.com/posts/mario_maker_2_api). As requests made to Nintendo's servers require authentication, the process had to be done with utmost care, limiting download speed so as not to overload the API and risk a ban. There are no intentions to create an updated release of this dataset.
## Considerations for Using the Data
The dataset consists of super worlds from many different Mario Maker 2 players globally, and as such harmful depictions could be present in their super world thumbnails.
|
DynamicSuperb/ChordClassification_AcousticGuitarAndPiano | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype: audio
- name: label
dtype: string
- name: instruction
dtype: string
splits:
- name: test
num_bytes: 39529783.64377183
num_examples: 200
download_size: 35272574
dataset_size: 39529783.64377183
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "chord_classification_acoustic_guitar_and_piano"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hoangdeeptry/tdtu_voice_dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 4028544765.378
num_examples: 1523
download_size: 3753065079
dataset_size: 4028544765.378
---
# Dataset Card for "tdtu_voice_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_mnli_finna_future | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 152442
num_examples: 648
- name: dev_mismatched
num_bytes: 156869
num_examples: 702
- name: test_matched
num_bytes: 136497
num_examples: 566
- name: test_mismatched
num_bytes: 139968
num_examples: 632
- name: train
num_bytes: 5520127
num_examples: 23152
download_size: 3669459
dataset_size: 6105903
---
# Dataset Card for "MULTI_VALUE_mnli_finna_future"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DBD-research-group/BirdSet | ---
task_categories:
- audio-classification
license: cc
tags:
- bird classification
- passive acoustic monitoring
---
## Dataset Description
- **Repository:** [https://github.com/DBD-research-group/GADME](https://github.com/DBD-research-group/BirdSet)
- **Paper:** [GADME](https://arxiv.org/abs/2403.10380)
- **Point of Contact:** [Lukas Rauch](mailto:lukas.rauch@uni-kassel.de)
### Datasets
We present the BirdSet benchmark that covers a comprehensive range of (multi-label and multi-class) classification datasets in avian bioacoustics.
We offer a static set of evaluation datasets and a varied collection of training datasets, enabling the application of diverse methodologies.
We have a complementary code base: https://github.com/DBD-research-group/BirdSet
and a complementary paper (work in progress): https://arxiv.org/abs/2403.10380
| | train | test | test_5s | size (GB) | #classes |
|--------------------------------|--------:|-----------:|--------:|-----------:|-------------:|
| [PER][1] (Amazon Basin) | 16,802 | 14,798 | 15,120 | 10.5 | 132 |
| [NES][2] (Colombia Costa Rica) | 16,117 | 6,952 | 24,480 | 14.2 | 89 |
| [UHH][3] (Hawaiian Islands) | 3,626 | 59,583 | 36,637 | 4.92 | 25 tr, 27 te |
| [HSN][4] (high_sierras) | 5,460 | 10,296 | 12,000 | 5.92 | 21 |
| [NBP][5] (NIPS4BPlus) | 24,327 | 5,493 | 563 | 29.9 | 51 |
| [POW][6] (Powdermill Nature) | 14,911 | 16,052 | 4,560 | 15.7 | 48 |
| [SSW][7] (Sapsucker Woods) | 28,403 | 50,760 | 205,200| 35.2 | 81 |
| [SNE][8] (Sierra Nevada) | 19,390 | 20,147 | 23,756 | 20.8 | 56 |
| [XCM][9] (Xenocanto Subset M) | 89,798 | x | x | 89.3 | 409 (411) |
| [XCL][10] (Xenocanto Complete) | 528,434| x | x | 484 | 9,735 |
[1]: https://zenodo.org/records/7079124
[2]: https://zenodo.org/records/7525349
[3]: https://zenodo.org/records/7078499
[4]: https://zenodo.org/records/7525805
[5]: https://github.com/fbravosanchez/NIPS4Bplus
[6]: https://zenodo.org/records/4656848
[7]: https://zenodo.org/records/7018484
[8]: https://zenodo.org/records/7050014
[9]: https://xeno-canto.org/
[10]: https://xeno-canto.org
- We assemble a training dataset for each test dataset that is a subset of a complete Xeno-Canto (XC) snapshot. We extract all recordings that have vocalizations of the bird species appearing in the test dataset.
- The focal training datasets or soundscape test datasets components can be individually accessed using the identifiers **NAME_xc** and **NAME_scape**, respectively (e.g., **HSN_xc** for the focal part and **HSN_scape** for the soundscape).
- We use the .ogg format for every recording and a sampling rate of 32 kHz.
- Each sample in the training dataset is a recording that may contain more than one vocalization of the corresponding bird species.
- Each recording in the training datasets has a unique recordist and the corresponding license from XC. We omit all recordings from XC that are CC-ND.
- The bird species are translated to ebird_codes.
- Snapshot date of XC: 03/10/2024
**Train**
- Exclusively using focal audio data from XC with quality ratings A, B, C and excluding all recordings that are CC-ND.
- Each dataset is tailored for specific target species identified in the corresponding test soundscape files.
- We transform the scientific names of the birds into the corresponding ebird_code label.
- We offer detected events and corresponding cluster assignments to identify bird sounds in each recording.
- We provide the full recordings from XC. These can generate multiple samples from a single instance.
**Test_5s**
- Task: Multilabel ("ebird_code_multilabel")
- Only soundscape data from Zenodo, formatted according to the Kaggle evaluation scheme.
- Each recording is segmented into 5-second intervals, to which each ground-truth bird vocalization is assigned.
- Segments without any label are included; these result in a [0] vector.
**Test**
- Task: Multiclass ("ebird_code")
- Only soundscape data sourced from Zenodo.
- We provide the full recording with the complete label set and specified bounding boxes.
- This dataset excludes recordings that do not contain bird calls ("no_call").
### Quick Use
- For multi-label evaluation with a segment-based evaluation use the test_5s column for testing.
- You could only load the first 5 seconds or a given event per recording to quickly create a training dataset.
- We recommend starting with HSN. It is a medium-sized dataset with a low number of overlaps within a segment.
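As an illustration of the segment-based scheme, a recording's waveform can be chunked into the 5-second windows that test_5s uses (this is a sketch assuming the documented 32 kHz sampling rate, not part of the BirdSet tooling):

```python
def segment_recording(waveform, sample_rate=32_000, segment_s=5):
    """Split a 1-D waveform into consecutive fixed-length segments.

    The final segment is kept even if it is shorter than segment_s seconds.
    """
    n = sample_rate * segment_s
    return [waveform[i:i + n] for i in range(0, len(waveform), n)]

# e.g. segments = segment_recording(sample["audio"]["array"])
```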
### Metadata
| | format | description |
|------------------------|-------------------------------------------------------:|-------------------------:|
| audio | Audio(sampling_rate=32_000, mono=True, decode=False) | audio object from hf |
| filepath | Value("string") | relative path where the recording is stored |
| start_time | Value("float64") | only testdata: start time of a vocalization in s |
| end_time                | Value("float64")                                        | only testdata: end time of a vocalization in s |
| low_freq | Value("int64") | only testdata: low frequency bound for a vocalization in kHz |
| high_freq | Value("int64") | only testdata: high frequency bound for a vocalization in kHz |
| ebird_code | ClassLabel(names=class_list) | assigned species label |
| ebird_code_secondary | Sequence(datasets.Value("string")) | only traindata: possible secondary species in a recording |
| ebird_code_multilabel | Sequence(datasets.ClassLabel(names=class_list)) | assigned species label in a multilabel format |
| call_type | Sequence(datasets.Value("string")) | only traindata: type of bird vocalization |
| sex | Value("string") | only traindata: sex of bird species |
| lat | Value("float64") | latitude of vocalization/recording in WGS84 |
| long                    | Value("float64")                                        | longitude of vocalization/recording in WGS84 |
| length | Value("int64") | length of the file in s |
| microphone | Value("string") | soundscape or focal recording with the microphone string |
| license | Value("string") | license of the recording |
| source | Value("string") | source of the recording |
| local_time | Value("string") | local time of the recording |
| detected_events | Sequence(datasets.Sequence(datasets.Value("float64")))| only traindata: detected audio events in a recording with bambird, tuples of start/end time |
| event_cluster | Sequence(datasets.Value("int64")) | only traindata: detected audio events assigned to a cluster with bambird |
| peaks | Sequence(datasets.Value("float64")) | only traindata: peak event detected with scipy peak detection |
| quality | Value("string") | only traindata: recording quality of the recording (A,B,C) |
| recordist | Value("string") | only traindata: recordist of the recording |
#### Example Metadata Train
```python
{'audio': {'path': '.ogg',
'array': array([ 0.0008485 , 0.00128899, -0.00317163, ..., 0.00228528,
0.00270796, -0.00120562]),
'sampling_rate': 32000},
'filepath': '.ogg',
'start_time': None,
'end_time': None,
'low_freq': None,
'high_freq': None,
'ebird_code': 0,
'ebird_code_multilabel': [0],
'ebird_code_secondary': ['plaant1', 'blfnun1', 'butwoo1', 'whtdov', 'undtin1', 'gryhaw3'],
'call_type': 'song',
'sex': 'uncertain',
'lat': -16.0538,
'long': -49.604,
'length': 46,
'microphone': 'focal',
'license': '//creativecommons.org/licenses/by-nc-sa/4.0/',
'source': 'xenocanto',
'local_time': '18:37',
'detected_events': [[0.736, 1.824],
[9.936, 10.944],
[13.872, 15.552],
[19.552, 20.752],
[24.816, 25.968],
[26.528, 32.16],
[36.112, 37.808],
[37.792, 38.88],
[40.048, 40.8],
[44.432, 45.616]],
'event_cluster': [0, 0, 0, 0, 0, -1, 0, 0, -1, 0],
'peaks': [14.76479119037789, 41.16993396760847],
'quality': 'A',
'recordist': '...'}
```
#### Example Metadata Test5s
```python
{'audio': {'path': '.ogg',
'array': array([-0.67190468, -0.9638235 , -0.99569213, ..., -0.01262935,
-0.01533066, -0.0141047 ]),
'sampling_rate': 32000},
'filepath': '.ogg',
'start_time': 0.0,
'end_time': 5.0,
'low_freq': 0,
'high_freq': 3098,
'ebird_code': None,
'ebird_code_multilabel': [1, 10],
'ebird_code_secondary': None,
'call_type': None,
'sex': None,
'lat': 5.59,
'long': -75.85,
'length': None,
'microphone': 'Soundscape',
'license': 'Creative Commons Attribution 4.0 International Public License',
'source': 'https://zenodo.org/record/7525349',
'local_time': '4:30:29',
'detected_events': None,
'event_cluster': None,
'peaks': None,
'quality': None,
'recordist': None}
```
### Citation Information
```
@misc{birdset,
title={BirdSet: A Multi-Task Benchmark for Classification in Avian Bioacoustics},
author={Lukas Rauch and Raphael Schwinger and Moritz Wirth and René Heinrich and Jonas Lange and Stefan Kahl and Bernhard Sick and Sven Tomforde and Christoph Scholz},
year={2024},
eprint={2403.10380},
archivePrefix={arXiv},
primaryClass={cs.SD}
}
```

Note that each test subset in the BirdSet dataset has its own citation; please see the source for the correct citation of each contained dataset. Each file in the training dataset also has its own recordist noted. The licenses can be found in the metadata. |
mrachilles/NTU60PointsDescription | ---
license: mit
---
|
etav22/congress_117_bills_pegasus_summaries | ---
dataset_info:
features:
- name: index
dtype: int64
- name: policy_areas
dtype: string
- name: cur_text
dtype: string
- name: cleaned_summary
dtype: string
- name: generated_summary
dtype: string
splits:
- name: test
num_bytes: 13039601
num_examples: 377
download_size: 4729722
dataset_size: 13039601
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
lshowway/wikipedia.reorder.OVS | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 4084815753
num_examples: 2312333
download_size: 2006658115
dataset_size: 4084815753
---
# Dataset Card for "wikipedia.reorder.OVS"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yuancj/character-profiles-romance-prepare2 | ---
dataset_info:
features:
- name: name
dtype: string
- name: categories
sequence: string
- name: personalities
sequence: string
- name: description
dtype: string
- name: conversation
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 20485
num_examples: 2
download_size: 30036
dataset_size: 20485
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "character-profiles-romance-prepare2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
aav-ds/Israel-HAMAS_war_news | ---
language:
- en
size_categories:
- 10K<n<100K
task_categories:
- text-classification
- text-generation
pretty_name: Israel-HAMAS war news
dataset_info:
features:
- name: url
dtype: string
- name: datetime
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: provider
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 14269346
num_examples: 13103
download_size: 6081837
dataset_size: 14269346
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for Israel-HAMAS war news
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
## Dataset Description
**Point of Contact:** [Alexander Akhterov](mailto:a.ahterov@gmail.com)
### Dataset Summary
The "Israel-HAMAS war news" dataset is an English-language dataset of news about Israel war against the terrorist organization -
HAMAS that happened after "black Saturday" - massive murders of civilian Israeli people on the 7th of October 2023.
We've accumulated news from the following sources:
- BBC (live news) - from 2023-11-05 to 2023-11-18. Total: 805
- The Times of Israel (live news) - from 2023-10-07 to 2023-11-18. Total: 6581
- Al Jazeera (live news) - from 2023-11-04 to 2023-11-25. Total: 3297
- Al Mayadeen (articles from the site) - from 2023-10-08 to 2023-11-24. Total: 74
- WAFA "News" Agency (articles from the site section "Occupation")- from 2023-09-28 to 2023-11-26. Total: 1020
- CNN live news - from 2023-10-26 to 2023-11-26. Total: 1428
All news is collected using the Beautiful Soup Python library. For the BBC, WAFA, and Al Mayadeen sites
simple GET requests were enough; in the other cases we used Selenium.
### Supported Tasks and Leaderboards
`sentiment-classification`, `semantic-similarity-classification`: The idea behind the dataset is to fine-tune one of the LLMs
to perform news sentiment analysis regarding pro- and anti-Israel attitudes (though the collected dataset can also be used
for other NLP tasks). One of the main issues is data labeling. To overcome it, we assume that almost all news from the Palestinian
"news" agency "WAFA" and the Lebanese agency "Al Mayadeen" takes an anti-Israel position. Most Al Jazeera news also tends to be against
Israel. In contrast, news by "The Times of Israel" is mostly pro-Israel.
For example, the following piece of news carries an anti-Israel pattern:
> KHAN YUNIS, Sunday, December 10, 2023 (WAFA) - At least 10 civilians were killed, mostly children, and dozens more were wounded
> early this morning as Israeli warplanes bombed a residential house in Khan Yunis, south of the Gaza Strip, as the Israeli
> aggression on the enclave enters its 65th day in a row. (WAFA "news" agency).
On the contrary, the following information from "The Times of Israel" is pro-Israel:
> Several thousand people demonstrate against antisemitism in Berlin as Germany grapples with a large increase in anti-Jewish
> incidents following Hamas’s assault on Israel two months ago. Police estimate that around 3,200 people gathered in the rain
> in the German capital, while organizers put the figure at 10,000, German news agency dpa reports. Participants in the protest,
> titled “Never again is now,” march to the Brandenburg Gate.
### Languages
The text in the dataset is in English since most news is in English. The associated BCP-47 code is en.
## Dataset Structure
### Data Instances
```
{'url': 'https://www.timesofisrael.com/liveblog_entry/man-arrested-in-death-of-jewish-protester-during-dueling-california-rallies-over-war/',
'datetime': '2023-11-16T18:36:15',
'title': 'Man arrested in death of Jewish protester during dueling California rallies over war',
'text': 'California authorities say they have arrested a man in connection with the death of a Jewish protester during demonstrations over the Israel-Hamas war.\nThe Ventura County Sheriff’s Office says the 50-year-old suspect was arrested today and will be booked into jail in the investigation of involuntary manslaughter — the unintentional killing of another person. The district attorney will decide whether there is enough evidence to bring a formal charge.\nPaul Kessler, 69, died early November 6 at a hospital following a November 5 confrontation with a pro-Palestinian demonstrator in Thousand Oaks, a suburb northwest of Los Angeles.\nSheriff Jim Fryhoff said subsequently that deputies determined Kessler had fallen backward and struck his head on the ground. The pro-Palestinian demonstrator stayed at the scene and told deputies he had called 911, Fryhoff said.',
'provider': 'The Times of Israel',
'source': 'site-live-news'}
```
### Data Fields
- "url" - link to the piece of news;
- "datetime" - news date and time (YYYY-mm-ddTHH:MM:SS);
- "title" - news title;
- "text" - news text;
- "provider" - news providers, can be 'BBC', 'The Times of Israel', 'Al Jazeera', 'Al Mayadeen', 'WAFA News Agency', and 'CNN';
- "source" - where news was collected.
## Dataset Creation
### Curation Rationale
The "Israel-HAMAS war news" dataset was built as a solution to the data annotation problem for the sentiment analysis task
regarding the pro- and anti-Israel attitudes. We've supposed that almost all news from the Palestinian "news" agency "WAFA",
Lebanese agency "Al Mayadeen", and Qatari "Al Jazeera" has anti-Israel position. In return news by "The Times of Israel"
is mostly pro-Israel.
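This heuristic can be sketched as a simple provider-based weak-labeling function (the provider strings follow the data-fields section above; the label names themselves are our own illustration, not part of the dataset):

```python
# Provider-based weak labels, following the curation rationale
ANTI_ISRAEL_PROVIDERS = {"WAFA News Agency", "Al Mayadeen", "Al Jazeera"}
PRO_ISRAEL_PROVIDERS = {"The Times of Israel"}

def weak_label(provider: str) -> str:
    """Assign a coarse sentiment label from the news provider alone."""
    if provider in ANTI_ISRAEL_PROVIDERS:
        return "anti-israel"
    if provider in PRO_ISRAEL_PROVIDERS:
        return "pro-israel"
    return "unlabeled"  # e.g. BBC and CNN are left unlabeled here
```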
### Source Data
The official sites of the news agencies are the sources of the dataset:
- The BBC - https://www.bbc.com/
- The Times of Israel - https://www.timesofisrael.com/
- Al Jazeera - https://www.aljazeera.com/
- Al Mayadeen - https://english.almayadeen.net/
- WAFA News Agency - https://english.wafa.ps/
- CNN - https://edition.cnn.com/
#### Initial Data Collection and Normalization
The data was obtained by web scraping of the appropriate sections of the news agencies' official sites.
### Annotations
The dataset does not contain any additional annotations.
### Personal and Sensitive Information
The news authors were removed.
|
amcoff/skolmat | ---
annotations_creators:
- expert-generated
language:
- sv
language_creators:
- found
license:
- mit
multilinguality:
- monolingual
pretty_name: Skolmat
size_categories: []
source_datasets:
- original
tags: []
task_categories:
- text-classification
task_ids: []
---
# Dataset Card for Skolmat
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
Tensoic/ClosedDomainQA_hn | ---
dataset_info:
features:
- name: input
dtype: string
- name: response
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 66750160
num_examples: 20508
download_size: 18085113
dataset_size: 66750160
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/southampton_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of southampton/サウサンプトン/南安普顿 (Azur Lane)
This is the dataset of southampton/サウサンプトン/南安普顿 (Azur Lane), containing 44 images and their tags.
The core tags of this character are `blonde_hair, green_eyes, ahoge, short_hair, bangs, hair_ornament, hairband, ribbon, breasts, red_ribbon, hair_ribbon, musical_note_hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 44 | 54.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/southampton_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 44 | 31.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/southampton_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 96 | 60.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/southampton_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 44 | 46.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/southampton_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 96 | 82.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/southampton_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/southampton_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, maid_headdress, solo, black_dress, looking_at_viewer, wrist_cuffs, maid_apron, open_mouth, puffy_short_sleeves, white_apron, black_footwear, frilled_apron, smile, cleavage, frilled_dress, garter_straps, standing, white_thighhighs, x_hair_ornament, back_bow, enmaided, full_body, high_heels, holding_instrument, medium_breasts, musical_note, violin, waist_apron, white_background |
| 1 | 11 |  |  |  |  |  | 1girl, looking_at_viewer, smile, solo, white_thighhighs, black_gloves, black_skirt, long_sleeves, pleated_skirt, red_headwear, garter_straps, musical_note, simple_background, white_background, boots, epaulettes, full_body, holding, red_hairband, red_jacket, hair_between_eyes, jewelry, red_footwear, ascot, blush, miniskirt, treble_clef, white_shirt, x_hair_ornament, beret, one_eye_closed, open_mouth, standing |
| 2 | 9 |  |  |  |  |  | 1girl, looking_at_viewer, solo, white_thighhighs, black_gloves, detached_sleeves, open_mouth, pleated_skirt, red_skirt, wide_sleeves, frills, long_sleeves, obi, ribbon_trim, white_kimono, bow, full_body, bare_shoulders, blush, fur_collar, paper_lantern, standing, zettai_ryouiki, :o, arm_up, black_ribbon, hand_up, holding, torii, zouri |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | maid_headdress | solo | black_dress | looking_at_viewer | wrist_cuffs | maid_apron | open_mouth | puffy_short_sleeves | white_apron | black_footwear | frilled_apron | smile | cleavage | frilled_dress | garter_straps | standing | white_thighhighs | x_hair_ornament | back_bow | enmaided | full_body | high_heels | holding_instrument | medium_breasts | musical_note | violin | waist_apron | white_background | black_gloves | black_skirt | long_sleeves | pleated_skirt | red_headwear | simple_background | boots | epaulettes | holding | red_hairband | red_jacket | hair_between_eyes | jewelry | red_footwear | ascot | blush | miniskirt | treble_clef | white_shirt | beret | one_eye_closed | detached_sleeves | red_skirt | wide_sleeves | frills | obi | ribbon_trim | white_kimono | bow | bare_shoulders | fur_collar | paper_lantern | zettai_ryouiki | :o | arm_up | black_ribbon | hand_up | torii | zouri |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:-------|:--------------|:--------------------|:--------------|:-------------|:-------------|:----------------------|:--------------|:-----------------|:----------------|:--------|:-----------|:----------------|:----------------|:-----------|:-------------------|:------------------|:-----------|:-----------|:------------|:-------------|:---------------------|:-----------------|:---------------|:---------|:--------------|:-------------------|:---------------|:--------------|:---------------|:----------------|:---------------|:--------------------|:--------|:-------------|:----------|:---------------|:-------------|:--------------------|:----------|:---------------|:--------|:--------|:------------|:--------------|:--------------|:--------|:-----------------|:-------------------|:------------|:---------------|:---------|:------|:--------------|:---------------|:------|:-----------------|:-------------|:----------------|:-----------------|:-----|:---------|:---------------|:----------|:--------|:--------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | | X | | X | | | X | | | | | X | | | X | X | X | X | | | X | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | | X | | X | | | X | | | | | | | | | X | X | | | | X | | | | | | | | X | | X | X | | | | | X | | | | | | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/mary_cochran_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of mary_cochran/メアリー・コクラン (THE iDOLM@STER: Cinderella Girls)
This is the dataset of mary_cochran/メアリー・コクラン (THE iDOLM@STER: Cinderella Girls), containing 84 images and their tags.
The core tags of this character are `blonde_hair, long_hair, twintails, bow, bangs, hair_bow, green_eyes, aqua_eyes, blunt_bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 84 | 60.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mary_cochran_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 84 | 49.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mary_cochran_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 152 | 86.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mary_cochran_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 84 | 58.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mary_cochran_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 152 | 100.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mary_cochran_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
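For the IMG+TXT packages above, a minimal sketch of pairing each image with its tags, assuming the common convention that every image in the extracted folder ships with a same-stem `.txt` of comma-separated tags (an assumption about the archive layout, not something the table above guarantees):

```python
import os

def pair_images_with_tags(dataset_dir):
    """Pair image files with same-stem .txt tag files in an IMG+TXT folder.

    Returns a list of (image_path, [tags]) tuples. The same-stem .txt
    convention is assumed, not guaranteed by the archive itself.
    """
    image_exts = {'.png', '.jpg', '.jpeg', '.webp'}
    pairs = []
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in image_exts:
            continue
        txt_path = os.path.join(dataset_dir, stem + '.txt')
        tags = []
        if os.path.exists(txt_path):
            with open(txt_path, encoding='utf-8') as f:
                tags = [t.strip() for t in f.read().split(',') if t.strip()]
        pairs.append((os.path.join(dataset_dir, name), tags))
    return pairs
```

Images with no matching `.txt` simply get an empty tag list, so the pairing is safe to run on a partially tagged folder.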
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mary_cochran_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some of the character's outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, solo, character_name, star_(symbol), sun_symbol, card_parody, hair_bobbles, innertube, one-piece_swimsuit, open_mouth, school_swimsuit, smile |
| 1 | 10 |  |  |  |  |  | 1girl, smile, solo, dress, looking_at_viewer, open_mouth, short_sleeves, skirt, blush, earrings, one_eye_closed, bracelet, striped, thighhighs |
| 2 | 6 |  |  |  |  |  | 1girl, blush, looking_at_viewer, solo, open_mouth, :d, close-up |
| 3 | 5 |  |  |  |  |  | 1girl, navel, solo, micro_bikini, smile, blue_eyes, blush, flat_chest, heart, looking_at_viewer, polka_dot_bow, side-tie_bikini_bottom, simple_background, white_background, american_flag_bikini, cowboy_shot, hand_on_hip, hand_up, one_eye_closed, small_breasts, thigh_strap, white_bow |
| 4 | 6 |  |  |  |  |  | 1girl, elbow_gloves, looking_at_viewer, midriff, black_gloves, flag, navel, solo, wrist_cuffs, band_uniform, plaid_skirt, union_jack, aiguillette, blue_eyes, boots, crop_top, epaulettes, shako_cap, smile, white_thighhighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | character_name | star_(symbol) | sun_symbol | card_parody | hair_bobbles | innertube | one-piece_swimsuit | open_mouth | school_swimsuit | smile | dress | looking_at_viewer | short_sleeves | skirt | blush | earrings | one_eye_closed | bracelet | striped | thighhighs | :d | close-up | navel | micro_bikini | blue_eyes | flat_chest | heart | polka_dot_bow | side-tie_bikini_bottom | simple_background | white_background | american_flag_bikini | cowboy_shot | hand_on_hip | hand_up | small_breasts | thigh_strap | white_bow | elbow_gloves | midriff | black_gloves | flag | wrist_cuffs | band_uniform | plaid_skirt | union_jack | aiguillette | boots | crop_top | epaulettes | shako_cap | white_thighhighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------------|:----------------|:-------------|:--------------|:---------------|:------------|:---------------------|:-------------|:------------------|:--------|:--------|:--------------------|:----------------|:--------|:--------|:-----------|:-----------------|:-----------|:----------|:-------------|:-----|:-----------|:--------|:---------------|:------------|:-------------|:--------|:----------------|:-------------------------|:--------------------|:-------------------|:-----------------------|:--------------|:--------------|:----------|:----------------|:--------------|:------------|:---------------|:----------|:---------------|:-------|:--------------|:---------------|:--------------|:-------------|:--------------|:--------|:-----------|:-------------|:------------|:-------------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | | | | | | | | X | | | | X | | | X | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | | | | | | | | | | X | | X | | | X | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | X | | | | | | | | | | X | | X | | | | | | | | | | | X | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
bigbio/iepa |
---
language:
- en
bigbio_language:
- English
license: unknown
multilinguality: monolingual
bigbio_license_shortname: UNKNOWN
pretty_name: IEPA
homepage: http://psb.stanford.edu/psb-online/proceedings/psb02/abstracts/p326.html
bigbio_pubmed: True
bigbio_public: True
bigbio_tasks:
- RELATION_EXTRACTION
---
# Dataset Card for IEPA
## Dataset Description
- **Homepage:** http://psb.stanford.edu/psb-online/proceedings/psb02/abstracts/p326.html
- **Pubmed:** True
- **Public:** True
- **Tasks:** RE
The IEPA benchmark PPI corpus is designed for relation extraction. It was created from 303 PubMed abstracts, each of which contains a specific pair of co-occurring chemicals.
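BigBIO datasets such as this one expose a harmonized `bigbio_kb` schema in which each document carries `entities` and `relations` lists, with relations referring to entities by id. A minimal sketch of resolving a relation's arguments to entity surface text (the field names follow the `bigbio_kb` convention; treat them as assumptions if the schema differs):

```python
def resolve_relations(document):
    """Map each relation's arg ids to the corresponding entity texts.

    Expects a dict in bigbio_kb style: entities have 'id' and 'text'
    (a list of surface strings), relations have 'type' plus
    'arg1_id'/'arg2_id' pointing at entity ids.
    """
    entity_text = {e['id']: ' '.join(e['text']) for e in document['entities']}
    resolved = []
    for rel in document['relations']:
        resolved.append((
            entity_text.get(rel['arg1_id'], '<unknown>'),
            rel['type'],
            entity_text.get(rel['arg2_id'], '<unknown>'),
        ))
    return resolved
```

The `<unknown>` fallback keeps the function total even if a relation references an entity id that is missing from the document.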
## Citation Information
```
@ARTICLE{ding2001mining,
title = "Mining {MEDLINE}: abstracts, sentences, or phrases?",
author = "Ding, J and Berleant, D and Nettleton, D and Wurtele, E",
journal = "Pac Symp Biocomput",
pages = "326--337",
year = 2002,
address = "United States",
language = "en"
}
```
|
stoddur/med_chat_8 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 173561040.0
num_examples: 112410
download_size: 4062144
dataset_size: 173561040.0
---
# Dataset Card for "med_chat_8"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
m-ric/Open_Assistant_Conversation_Chains | ---
license: apache-2.0
language:
- en
- es
- ru
- de
- pl
- th
- vi
- sv
- bn
- da
- he
- it
- fa
- sk
- id
- nb
- el
- nl
- hu
- eu
- zh
- eo
- ja
- ca
- cs
- bg
- fi
- pt
- tr
- ro
- ar
- uk
- gl
- fr
- ko
task_categories:
- conversational
- text-generation
tags:
- human-feedback
size_categories:
- 10K<n<100K
pretty_name: OpenAssistant Conversations Unrolled
---
# Dataset Card for Dataset Name
## Dataset description
<!-- Provide a quick summary of the dataset. -->
This dataset is a reformatting of [OpenAssistant Conversations (OASST1)](https://huggingface.co/datasets/OpenAssistant/oasst1), which is
> a human-generated, human-annotated assistant-style conversation corpus consisting of 161,443 messages in 35 different languages, annotated with 461,292 quality ratings, resulting in over 10,000 fully annotated conversation trees. The corpus is a product of a worldwide crowd-sourcing effort involving over 13,500 volunteers.
This version was derived from the original by walking each conversation tree from root to leaf and reassembling the messages along each branch into linear conversation streams.
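The unrolling step described above can be sketched as follows: given flat messages that each point to a parent, walk every root-to-leaf path and emit it as one linear conversation (a minimal sketch under the assumption of `message_id`/`parent_id`/`content` fields; the real OASST1 records carry more metadata):

```python
def unroll_tree(messages):
    """Turn parent-linked messages into root-to-leaf conversation chains.

    `messages` is a list of dicts with 'message_id', 'parent_id' (None for
    the root) and 'content'. Returns one list of contents per leaf.
    """
    children = {}
    roots = []
    for m in messages:
        if m['parent_id'] is None:
            roots.append(m)
        else:
            children.setdefault(m['parent_id'], []).append(m)

    chains = []

    def walk(node, path):
        path = path + [node['content']]
        kids = children.get(node['message_id'], [])
        if not kids:
            chains.append(path)  # reached a leaf: emit the full branch
        for kid in kids:
            walk(kid, path)

    for root in roots:
        walk(root, [])
    return chains
```

Each branch of the tree becomes its own conversation, so a tree with N leaves yields N (partially overlapping) streams.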
#### Who are the editors?
Aymeric Roucher, Hugging Face.
### Example
````
{
'conversation_id': '6708c47f-05c9-4346-b3d2-40b2bd24fde4',
'user_id': '2c96e467-66f0-4be7-9693-bda51356a424',
'created_date': '2023-02-06T18:48:49.391686+00:00',
'messages': [
{'content': 'Can you write a short introduction about the relevance of the term "monopsony" in economics? Please use examples related to potential monopsonies in the labour market and cite relevant research.',
'role': 'user'},
{'content': '"Monopsony" refers to a market structure where there is only one buyer for a particular good or service. In economics, this term is particularly relevant in the labor market, where a monopsony employer has significant power over the wages and working conditions of their employees. The presence of a monopsony can result in lower wages and reduced employment opportunities for workers, as the employer has little incentive to increase wages or provide better working conditions.\n\nRecent research has identified potential monopsonies in industries such as retail and fast food, where a few large companies control a significant portion of the market (Bivens & Mishel, 2013). In these industries, workers often face low wages, limited benefits, and reduced bargaining power, leading to a situation where they are dependent on the employer for their livelihood. This dependence can result in further suppression of wages and a decline in working conditions.\n\nOverall, the concept of monopsony is essential to understanding the dynamics of labor markets and the impact of market power on workers. Further research is needed to understand the extent and impact of monopsonies on the economy and to develop policies to address this issue.\n\nReferences:\nBivens, J., & Mishel, L. (2013). The Pay of Corporate Executives and Financial Professionals as Evidence of Rents in Top 1 Percent Incomes. Journal of Economic Perspectives, 27(3), 57-78.',
'role': 'assistant'},
{'content': 'Now explain it to a dog', 'role': 'user'}
]
}
```` |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-latex-20000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 988901
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bdsaglam/web_nlg-erx-sft-multi-turn-multi-sentence-sharegpt | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 10316175
num_examples: 5876
- name: dev
num_bytes: 1304269
num_examples: 742
- name: test
num_bytes: 2379889
num_examples: 1230
download_size: 5241796
dataset_size: 14000333
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
|
liuyanchen1015/VALUE_sst2_lexical | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 82922
num_examples: 579
- name: test
num_bytes: 169738
num_examples: 1197
- name: train
num_bytes: 2796697
num_examples: 26645
download_size: 1900719
dataset_size: 3049357
---
# Dataset Card for "VALUE_sst2_lexical"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
clarin-knext/quora-pl | ---
language:
- pl
---
Part of **BEIR-PL: Zero Shot Information Retrieval Benchmark for the Polish Language**.
Link to arxiv: https://arxiv.org/pdf/2305.19840.pdf
Contact: konrad.wojtasik@pwr.edu.pl |
enzostvs/stable-diffusion-tpu-generations | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: "images/*.png"
---
|
2003achu/Caption | ---
license: mit
---
|
pphuc25/cv13-hi-test-vectorized | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: input_length
dtype: int64
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 2831277024
num_examples: 2947
download_size: 494107812
dataset_size: 2831277024
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "baseline-eval-vectorized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_ycros__BagelMIsteryTour-8x7B | ---
pretty_name: Evaluation run of ycros/BagelMIsteryTour-8x7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ycros/BagelMIsteryTour-8x7B](https://huggingface.co/ycros/BagelMIsteryTour-8x7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ycros__BagelMIsteryTour-8x7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-16T11:57:17.024146](https://huggingface.co/datasets/open-llm-leaderboard/details_ycros__BagelMIsteryTour-8x7B/blob/main/results_2024-01-16T11-57-17.024146.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7118537836260681,\n\
\ \"acc_stderr\": 0.030292814267622557,\n \"acc_norm\": 0.7154538145620108,\n\
\ \"acc_norm_stderr\": 0.030880099721365066,\n \"mc1\": 0.602203182374541,\n\
\ \"mc1_stderr\": 0.017133934248559676,\n \"mc2\": 0.7494844581449875,\n\
\ \"mc2_stderr\": 0.014345730353310387\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7005119453924915,\n \"acc_stderr\": 0.013385021637313572,\n\
\ \"acc_norm\": 0.7244027303754266,\n \"acc_norm_stderr\": 0.01305716965576184\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6919936267675761,\n\
\ \"acc_stderr\": 0.004607256752931882,\n \"acc_norm\": 0.8750248954391555,\n\
\ \"acc_norm_stderr\": 0.0033001484456091326\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7111111111111111,\n\
\ \"acc_stderr\": 0.03915450630414251,\n \"acc_norm\": 0.7111111111111111,\n\
\ \"acc_norm_stderr\": 0.03915450630414251\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8026315789473685,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.8026315789473685,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n\
\ \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7962264150943397,\n \"acc_stderr\": 0.024790784501775406,\n\
\ \"acc_norm\": 0.7962264150943397,\n \"acc_norm_stderr\": 0.024790784501775406\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8263888888888888,\n\
\ \"acc_stderr\": 0.03167473383795718,\n \"acc_norm\": 0.8263888888888888,\n\
\ \"acc_norm_stderr\": 0.03167473383795718\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.63,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6994219653179191,\n\
\ \"acc_stderr\": 0.034961014811911786,\n \"acc_norm\": 0.6994219653179191,\n\
\ \"acc_norm_stderr\": 0.034961014811911786\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.04966570903978529,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.04966570903978529\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7021276595744681,\n \"acc_stderr\": 0.02989614568209546,\n\
\ \"acc_norm\": 0.7021276595744681,\n \"acc_norm_stderr\": 0.02989614568209546\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6403508771929824,\n\
\ \"acc_stderr\": 0.04514496132873633,\n \"acc_norm\": 0.6403508771929824,\n\
\ \"acc_norm_stderr\": 0.04514496132873633\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.040703290137070705,\n\
\ \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.040703290137070705\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5158730158730159,\n \"acc_stderr\": 0.02573833063941215,\n \"\
acc_norm\": 0.5158730158730159,\n \"acc_norm_stderr\": 0.02573833063941215\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5396825396825397,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.5396825396825397,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8580645161290322,\n\
\ \"acc_stderr\": 0.019853003676559747,\n \"acc_norm\": 0.8580645161290322,\n\
\ \"acc_norm_stderr\": 0.019853003676559747\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6108374384236454,\n \"acc_stderr\": 0.034304624161038716,\n\
\ \"acc_norm\": 0.6108374384236454,\n \"acc_norm_stderr\": 0.034304624161038716\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\"\
: 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8303030303030303,\n \"acc_stderr\": 0.029311188674983116,\n\
\ \"acc_norm\": 0.8303030303030303,\n \"acc_norm_stderr\": 0.029311188674983116\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8787878787878788,\n \"acc_stderr\": 0.023253157951942084,\n \"\
acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.023253157951942084\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9533678756476683,\n \"acc_stderr\": 0.015216761819262577,\n\
\ \"acc_norm\": 0.9533678756476683,\n \"acc_norm_stderr\": 0.015216761819262577\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7153846153846154,\n \"acc_stderr\": 0.0228783227997063,\n \
\ \"acc_norm\": 0.7153846153846154,\n \"acc_norm_stderr\": 0.0228783227997063\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8067226890756303,\n \"acc_stderr\": 0.025649470265889183,\n\
\ \"acc_norm\": 0.8067226890756303,\n \"acc_norm_stderr\": 0.025649470265889183\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"\
acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8917431192660551,\n \"acc_stderr\": 0.013321348447611753,\n \"\
acc_norm\": 0.8917431192660551,\n \"acc_norm_stderr\": 0.013321348447611753\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6018518518518519,\n \"acc_stderr\": 0.033384734032074016,\n \"\
acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.033384734032074016\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8725490196078431,\n \"acc_stderr\": 0.02340553048084631,\n \"\
acc_norm\": 0.8725490196078431,\n \"acc_norm_stderr\": 0.02340553048084631\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8734177215189873,\n \"acc_stderr\": 0.021644195727955173,\n \
\ \"acc_norm\": 0.8734177215189873,\n \"acc_norm_stderr\": 0.021644195727955173\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.726457399103139,\n\
\ \"acc_stderr\": 0.029918586707798827,\n \"acc_norm\": 0.726457399103139,\n\
\ \"acc_norm_stderr\": 0.029918586707798827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8512396694214877,\n \"acc_stderr\": 0.03248470083807194,\n \"\
acc_norm\": 0.8512396694214877,\n \"acc_norm_stderr\": 0.03248470083807194\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8611111111111112,\n\
\ \"acc_stderr\": 0.0334327006286962,\n \"acc_norm\": 0.8611111111111112,\n\
\ \"acc_norm_stderr\": 0.0334327006286962\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.03157065078911899,\n\
\ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.03157065078911899\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n\
\ \"acc_stderr\": 0.04616143075028546,\n \"acc_norm\": 0.6160714285714286,\n\
\ \"acc_norm_stderr\": 0.04616143075028546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n\
\ \"acc_stderr\": 0.01987565502786746,\n \"acc_norm\": 0.8974358974358975,\n\
\ \"acc_norm_stderr\": 0.01987565502786746\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8684546615581098,\n\
\ \"acc_stderr\": 0.012086705214250428,\n \"acc_norm\": 0.8684546615581098,\n\
\ \"acc_norm_stderr\": 0.012086705214250428\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7832369942196532,\n \"acc_stderr\": 0.022183477668412856,\n\
\ \"acc_norm\": 0.7832369942196532,\n \"acc_norm_stderr\": 0.022183477668412856\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4581005586592179,\n\
\ \"acc_stderr\": 0.016663683295020527,\n \"acc_norm\": 0.4581005586592179,\n\
\ \"acc_norm_stderr\": 0.016663683295020527\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.023805186524888146,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.023805186524888146\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8038585209003215,\n\
\ \"acc_stderr\": 0.02255244778047801,\n \"acc_norm\": 0.8038585209003215,\n\
\ \"acc_norm_stderr\": 0.02255244778047801\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8271604938271605,\n \"acc_stderr\": 0.021038517770157368,\n\
\ \"acc_norm\": 0.8271604938271605,\n \"acc_norm_stderr\": 0.021038517770157368\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5602836879432624,\n \"acc_stderr\": 0.029609912075594116,\n \
\ \"acc_norm\": 0.5602836879432624,\n \"acc_norm_stderr\": 0.029609912075594116\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5273794002607562,\n\
\ \"acc_stderr\": 0.012751075788015074,\n \"acc_norm\": 0.5273794002607562,\n\
\ \"acc_norm_stderr\": 0.012751075788015074\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.023886881922440345,\n\
\ \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.023886881922440345\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7565359477124183,\n \"acc_stderr\": 0.017362473762146634,\n \
\ \"acc_norm\": 0.7565359477124183,\n \"acc_norm_stderr\": 0.017362473762146634\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644286,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644286\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8081632653061225,\n \"acc_stderr\": 0.02520696315422539,\n\
\ \"acc_norm\": 0.8081632653061225,\n \"acc_norm_stderr\": 0.02520696315422539\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8805970149253731,\n\
\ \"acc_stderr\": 0.02292879327721974,\n \"acc_norm\": 0.8805970149253731,\n\
\ \"acc_norm_stderr\": 0.02292879327721974\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.024103384202072864,\n\
\ \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.024103384202072864\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.602203182374541,\n\
\ \"mc1_stderr\": 0.017133934248559676,\n \"mc2\": 0.7494844581449875,\n\
\ \"mc2_stderr\": 0.014345730353310387\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8200473559589582,\n \"acc_stderr\": 0.010796468688068684\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5981804397270659,\n \
\ \"acc_stderr\": 0.013504357787494039\n }\n}\n```"
repo_url: https://huggingface.co/ycros/BagelMIsteryTour-8x7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|arc:challenge|25_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|gsm8k|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hellaswag|10_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T11-57-17.024146.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-16T11-57-17.024146.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- '**/details_harness|winogrande|5_2024-01-16T11-57-17.024146.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-16T11-57-17.024146.parquet'
- config_name: results
data_files:
- split: 2024_01_16T11_57_17.024146
path:
- results_2024-01-16T11-57-17.024146.parquet
- split: latest
path:
- results_2024-01-16T11-57-17.024146.parquet
---
# Dataset Card for Evaluation run of ycros/BagelMIsteryTour-8x7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ycros/BagelMIsteryTour-8x7B](https://huggingface.co/ycros/BagelMIsteryTour-8x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ycros__BagelMIsteryTour-8x7B",
"harness_winogrande_5",
	split="latest")
```
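As the config list above shows, each run's split name is derived from its timestamp by replacing characters that are not valid in split names. A minimal sketch of that mapping (the helper name `timestamp_to_split` is hypothetical, chosen here for illustration):

```python
def timestamp_to_split(ts: str) -> str:
    """Convert a run timestamp like '2024-01-16T11:57:17.024146'
    into the split name used by this dataset's configs."""
    # Dashes and colons become underscores; the fractional-second dot is kept.
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2024-01-16T11:57:17.024146"))
# 2024_01_16T11_57_17.024146
```

Passing the resulting string as `split=` loads the details for that specific run instead of the "latest" alias.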
## Latest results
These are the [latest results from run 2024-01-16T11:57:17.024146](https://huggingface.co/datasets/open-llm-leaderboard/details_ycros__BagelMIsteryTour-8x7B/blob/main/results_2024-01-16T11-57-17.024146.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7118537836260681,
"acc_stderr": 0.030292814267622557,
"acc_norm": 0.7154538145620108,
"acc_norm_stderr": 0.030880099721365066,
"mc1": 0.602203182374541,
"mc1_stderr": 0.017133934248559676,
"mc2": 0.7494844581449875,
"mc2_stderr": 0.014345730353310387
},
"harness|arc:challenge|25": {
"acc": 0.7005119453924915,
"acc_stderr": 0.013385021637313572,
"acc_norm": 0.7244027303754266,
"acc_norm_stderr": 0.01305716965576184
},
"harness|hellaswag|10": {
"acc": 0.6919936267675761,
"acc_stderr": 0.004607256752931882,
"acc_norm": 0.8750248954391555,
"acc_norm_stderr": 0.0033001484456091326
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7111111111111111,
"acc_stderr": 0.03915450630414251,
"acc_norm": 0.7111111111111111,
"acc_norm_stderr": 0.03915450630414251
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8026315789473685,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.8026315789473685,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7962264150943397,
"acc_stderr": 0.024790784501775406,
"acc_norm": 0.7962264150943397,
"acc_norm_stderr": 0.024790784501775406
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8263888888888888,
"acc_stderr": 0.03167473383795718,
"acc_norm": 0.8263888888888888,
"acc_norm_stderr": 0.03167473383795718
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.034961014811911786,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.034961014811911786
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.04966570903978529,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.04966570903978529
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7021276595744681,
"acc_stderr": 0.02989614568209546,
"acc_norm": 0.7021276595744681,
"acc_norm_stderr": 0.02989614568209546
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6403508771929824,
"acc_stderr": 0.04514496132873633,
"acc_norm": 0.6403508771929824,
"acc_norm_stderr": 0.04514496132873633
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.040703290137070705,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.040703290137070705
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5158730158730159,
"acc_stderr": 0.02573833063941215,
"acc_norm": 0.5158730158730159,
"acc_norm_stderr": 0.02573833063941215
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5396825396825397,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.5396825396825397,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8580645161290322,
"acc_stderr": 0.019853003676559747,
"acc_norm": 0.8580645161290322,
"acc_norm_stderr": 0.019853003676559747
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6108374384236454,
"acc_stderr": 0.034304624161038716,
"acc_norm": 0.6108374384236454,
"acc_norm_stderr": 0.034304624161038716
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8303030303030303,
"acc_stderr": 0.029311188674983116,
"acc_norm": 0.8303030303030303,
"acc_norm_stderr": 0.029311188674983116
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.023253157951942084,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.023253157951942084
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9533678756476683,
"acc_stderr": 0.015216761819262577,
"acc_norm": 0.9533678756476683,
"acc_norm_stderr": 0.015216761819262577
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7153846153846154,
"acc_stderr": 0.0228783227997063,
"acc_norm": 0.7153846153846154,
"acc_norm_stderr": 0.0228783227997063
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8067226890756303,
"acc_stderr": 0.025649470265889183,
"acc_norm": 0.8067226890756303,
"acc_norm_stderr": 0.025649470265889183
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8917431192660551,
"acc_stderr": 0.013321348447611753,
"acc_norm": 0.8917431192660551,
"acc_norm_stderr": 0.013321348447611753
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8725490196078431,
"acc_stderr": 0.02340553048084631,
"acc_norm": 0.8725490196078431,
"acc_norm_stderr": 0.02340553048084631
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8734177215189873,
"acc_stderr": 0.021644195727955173,
"acc_norm": 0.8734177215189873,
"acc_norm_stderr": 0.021644195727955173
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.726457399103139,
"acc_stderr": 0.029918586707798827,
"acc_norm": 0.726457399103139,
"acc_norm_stderr": 0.029918586707798827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8512396694214877,
"acc_stderr": 0.03248470083807194,
"acc_norm": 0.8512396694214877,
"acc_norm_stderr": 0.03248470083807194
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.0334327006286962,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.0334327006286962
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.03157065078911899,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.03157065078911899
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6160714285714286,
"acc_stderr": 0.04616143075028546,
"acc_norm": 0.6160714285714286,
"acc_norm_stderr": 0.04616143075028546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8974358974358975,
"acc_stderr": 0.01987565502786746,
"acc_norm": 0.8974358974358975,
"acc_norm_stderr": 0.01987565502786746
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8684546615581098,
"acc_stderr": 0.012086705214250428,
"acc_norm": 0.8684546615581098,
"acc_norm_stderr": 0.012086705214250428
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7832369942196532,
"acc_stderr": 0.022183477668412856,
"acc_norm": 0.7832369942196532,
"acc_norm_stderr": 0.022183477668412856
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4581005586592179,
"acc_stderr": 0.016663683295020527,
"acc_norm": 0.4581005586592179,
"acc_norm_stderr": 0.016663683295020527
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.023805186524888146,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.023805186524888146
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8038585209003215,
"acc_stderr": 0.02255244778047801,
"acc_norm": 0.8038585209003215,
"acc_norm_stderr": 0.02255244778047801
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8271604938271605,
"acc_stderr": 0.021038517770157368,
"acc_norm": 0.8271604938271605,
"acc_norm_stderr": 0.021038517770157368
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5602836879432624,
"acc_stderr": 0.029609912075594116,
"acc_norm": 0.5602836879432624,
"acc_norm_stderr": 0.029609912075594116
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5273794002607562,
"acc_stderr": 0.012751075788015074,
"acc_norm": 0.5273794002607562,
"acc_norm_stderr": 0.012751075788015074
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.023886881922440345,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.023886881922440345
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7565359477124183,
"acc_stderr": 0.017362473762146634,
"acc_norm": 0.7565359477124183,
"acc_norm_stderr": 0.017362473762146634
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644286,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8081632653061225,
"acc_stderr": 0.02520696315422539,
"acc_norm": 0.8081632653061225,
"acc_norm_stderr": 0.02520696315422539
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8805970149253731,
"acc_stderr": 0.02292879327721974,
"acc_norm": 0.8805970149253731,
"acc_norm_stderr": 0.02292879327721974
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.024103384202072864,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.024103384202072864
},
"harness|truthfulqa:mc|0": {
"mc1": 0.602203182374541,
"mc1_stderr": 0.017133934248559676,
"mc2": 0.7494844581449875,
"mc2_stderr": 0.014345730353310387
},
"harness|winogrande|5": {
"acc": 0.8200473559589582,
"acc_stderr": 0.010796468688068684
},
"harness|gsm8k|5": {
"acc": 0.5981804397270659,
"acc_stderr": 0.013504357787494039
}
}
```
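The per-task standard errors above also serve as a quick sanity check on sample sizes. Assuming the harness reports the standard error of a proportion as sqrt(p·(1−p)/(n−1)) (a common choice; this is an assumption, not something stated in this card), the abstract_algebra figures recover a test set of roughly 100 questions:

```python
# Reported abstract_algebra scores from the results above
p = 0.39                   # acc
se = 0.04902071300001975   # acc_stderr

# If se = sqrt(p * (1 - p) / (n - 1)), solve for the sample size n
n = p * (1 - p) / se ** 2 + 1
print(round(n))  # 100
```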
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
aeropriest/ariel | ---
license: openrail
---
|
Lam-ia/Wikipedia-Euskera-dataseta | ---
license: apache-2.0
task_categories:
- text-generation
language:
- eu
pretty_name: Wikipedia Euskeraz
size_categories:
- 1K<n<10K
--- |
CyberHarem/tanaka_kotoha_theidolmstermillionlive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of tanaka_kotoha/田中琴葉/타나카코토하 (THE iDOLM@STER: Million Live!)
This is the dataset of tanaka_kotoha/田中琴葉/타나카코토하 (THE iDOLM@STER: Million Live!), containing 355 images and their tags.
The core tags of this character are `long_hair, brown_hair, brown_eyes, hairband, bangs, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 355 | 414.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tanaka_kotoha_theidolmstermillionlive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 355 | 255.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tanaka_kotoha_theidolmstermillionlive/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 818 | 528.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tanaka_kotoha_theidolmstermillionlive/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 355 | 374.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tanaka_kotoha_theidolmstermillionlive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 818 | 724.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tanaka_kotoha_theidolmstermillionlive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/tanaka_kotoha_theidolmstermillionlive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
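The `item.meta['tags']` dictionaries yielded by the loop above can be aggregated with the standard library to see which tags dominate, similar to the clusters listed in the next section. A minimal sketch (the dictionaries here are illustrative stand-ins, not actual dataset contents):

```python
from collections import Counter

# Illustrative stand-ins for the item.meta['tags'] dictionaries
all_tags = [
    {'1girl': 1, 'solo': 1, 'looking_at_viewer': 1},
    {'1girl': 1, 'solo': 1, 'skirt': 1},
    {'1girl': 1, 'school_uniform': 1},
]
# Count how often each tag appears across the collection
tag_counts = Counter(tag for tags in all_tags for tag in tags)
print(tag_counts.most_common(2))  # [('1girl', 3), ('solo', 2)]
```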
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 21 |  |  |  |  |  | 1girl, solo, looking_at_viewer, open_mouth, blush, skirt, cleavage, medium_breasts, navel, necklace, :d, hair_ornament |
| 1 | 7 |  |  |  |  |  | 1girl, solo, skirt, blush, open_mouth, :d, looking_at_viewer, bow, crown, microphone, necktie |
| 2 | 7 |  |  |  |  |  | navel, 1girl, blush, looking_at_viewer, sailor_bikini, smile, solo, white_bikini, cowboy_shot, open_mouth, standing, bare_arms, bare_shoulders, blue_neckerchief, blue_sky, blunt_bangs, collarbone, day, hat, medium_breasts, outdoors, arms_up, beach, blue_sailor_collar, bow_bikini, cleavage, cloud, white_hairband, white_headwear |
| 3 | 5 |  |  |  |  |  | 1girl, blush, looking_at_viewer, simple_background, solo, white_background, closed_mouth, smile, upper_body, yellow_hairband, white_shirt, arms_up, long_sleeves, short_sleeves, tying_hair, very_long_hair |
| 4 | 10 |  |  |  |  |  | plaid_skirt, pleated_skirt, white_shirt, 1girl, school_uniform, solo, white_background, simple_background, blush, hair_ribbon, looking_at_viewer, miniskirt, socks, long_sleeves, blazer, blue_skirt, bow, closed_mouth, diagonal-striped_necktie, diagonal_stripes, dress_shirt, green_ribbon, red_skirt, wing_collar, brown_footwear, crotch_seam, loafers, short_sleeves, standing, sweater_vest, white_panties |
| 5 | 5 |  |  |  |  |  | 1boy, 1girl, blush, hetero, open_mouth, penis, solo_focus, vaginal, female_pubic_hair, looking_at_viewer, pussy, sweat, bar_censor, clothed_sex, navel, nipples, pov, spread_legs, :d, lying, m_legs, no_panties, nude, serafuku, skirt_lift |
| 6 | 9 |  |  |  |  |  | 1girl, solo, belt, epaulettes, looking_at_viewer, red_eyes, smile, dark_persona, red_jacket, black_gloves, cape, long_sleeves, open_mouth, pleated_skirt, white_shirt, black_hairband, black_thighhighs, boots, collared_shirt, holding_sword, uniform |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | open_mouth | blush | skirt | cleavage | medium_breasts | navel | necklace | :d | hair_ornament | bow | crown | microphone | necktie | sailor_bikini | smile | white_bikini | cowboy_shot | standing | bare_arms | bare_shoulders | blue_neckerchief | blue_sky | blunt_bangs | collarbone | day | hat | outdoors | arms_up | beach | blue_sailor_collar | bow_bikini | cloud | white_hairband | white_headwear | simple_background | white_background | closed_mouth | upper_body | yellow_hairband | white_shirt | long_sleeves | short_sleeves | tying_hair | very_long_hair | plaid_skirt | pleated_skirt | school_uniform | hair_ribbon | miniskirt | socks | blazer | blue_skirt | diagonal-striped_necktie | diagonal_stripes | dress_shirt | green_ribbon | red_skirt | wing_collar | brown_footwear | crotch_seam | loafers | sweater_vest | white_panties | 1boy | hetero | penis | solo_focus | vaginal | female_pubic_hair | pussy | sweat | bar_censor | clothed_sex | nipples | pov | spread_legs | lying | m_legs | no_panties | nude | serafuku | skirt_lift | belt | epaulettes | red_eyes | dark_persona | red_jacket | black_gloves | cape | black_hairband | black_thighhighs | boots | collared_shirt | holding_sword | uniform |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:-------------|:--------|:--------|:-----------|:-----------------|:--------|:-----------|:-----|:----------------|:------|:--------|:-------------|:----------|:----------------|:--------|:---------------|:--------------|:-----------|:------------|:-----------------|:-------------------|:-----------|:--------------|:-------------|:------|:------|:-----------|:----------|:--------|:---------------------|:-------------|:--------|:-----------------|:-----------------|:--------------------|:-------------------|:---------------|:-------------|:------------------|:--------------|:---------------|:----------------|:-------------|:-----------------|:--------------|:----------------|:-----------------|:--------------|:------------|:--------|:---------|:-------------|:---------------------------|:-------------------|:--------------|:---------------|:------------|:--------------|:-----------------|:--------------|:----------|:---------------|:----------------|:-------|:---------|:--------|:-------------|:----------|:--------------------|:--------|:--------|:-------------|:--------------|:----------|:------|:--------------|:--------|:---------|:-------------|:-------|:-----------|:-------------|:-------|:-------------|:-----------|:---------------|:-------------|:---------------|:-------|:-----------------|:-------------------|:--------|:-----------------|:----------------|:----------|
| 0 | 21 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | X | X | | | | | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | X | X | | X | X | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | X | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 10 |  |  |  |  |  | X | X | X | | X | | | | | | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | X | X | X | | | X | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | X | X | X | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 6 | 9 |  |  |  |  |  | X | X | X | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/igrene_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of igrene (Fire Emblem)
This is the dataset of igrene (Fire Emblem), containing 198 images and their tags.
The core tags of this character are `blonde_hair, long_hair, breasts, large_breasts, dark_skin, mole, yellow_eyes, dark-skinned_female, mole_under_eye`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 198 | 321.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/igrene_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 198 | 160.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/igrene_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 481 | 340.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/igrene_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 198 | 272.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/igrene_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 481 | 513.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/igrene_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/igrene_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1boy, 1girl, hetero, penis, solo_focus, facial, nipples, nude, blush, cum_on_breasts, mosaic_censoring, paizuri, sweat, tears |
| 1 | 5 |  |  |  |  |  | 1girl, black_gloves, elbow_gloves, nipples, thigh_boots, thighhighs, 1boy, cum_in_pussy, hetero, no_mole, on_back, penis, red_dress, vaginal, blush, breasts_out, clothed_sex, female_pubic_hair, fingerless_gloves, mosaic_censoring, open_mouth, spread_legs, bar_censor, cape, missionary, scarf, solo_focus, torn_clothes |
| 2 | 6 |  |  |  |  |  | 1girl, belt, black_gloves, cleavage, elbow_gloves, fingerless_gloves, quiver, red_dress, short_dress, solo, thigh_boots, thighhighs, arrow_(projectile), black_footwear, bow_(weapon), brown_cape, looking_at_viewer, smile |
| 3 | 10 |  |  |  |  |  | 1girl, black_gloves, cleavage, elbow_gloves, ninja, official_alternate_costume, solo, thighhighs, fingerless_gloves, kunai, looking_at_viewer, red_dress, thighs, brown_scarf, pelvic_curtain, smile, thigh_boots, cape, holding, night |
| 4 | 5 |  |  |  |  |  | 1girl, blush, nipples, smile, solo, bangs, collarbone, completely_nude, hair_between_eyes, looking_at_viewer, navel, pussy, barefoot, holding, standing, very_long_hair, blurry, brown_eyes, lipstick, lying, shiny_skin, simple_background, uncensored, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1boy | 1girl | hetero | penis | solo_focus | facial | nipples | nude | blush | cum_on_breasts | mosaic_censoring | paizuri | sweat | tears | black_gloves | elbow_gloves | thigh_boots | thighhighs | cum_in_pussy | no_mole | on_back | red_dress | vaginal | breasts_out | clothed_sex | female_pubic_hair | fingerless_gloves | open_mouth | spread_legs | bar_censor | cape | missionary | scarf | torn_clothes | belt | cleavage | quiver | short_dress | solo | arrow_(projectile) | black_footwear | bow_(weapon) | brown_cape | looking_at_viewer | smile | ninja | official_alternate_costume | kunai | thighs | brown_scarf | pelvic_curtain | holding | night | bangs | collarbone | completely_nude | hair_between_eyes | navel | pussy | barefoot | standing | very_long_hair | blurry | brown_eyes | lipstick | lying | shiny_skin | simple_background | uncensored | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------|:--------|:---------|:--------|:-------------|:---------|:----------|:-------|:--------|:-----------------|:-------------------|:----------|:--------|:--------|:---------------|:---------------|:--------------|:-------------|:---------------|:----------|:----------|:------------|:----------|:--------------|:--------------|:--------------------|:--------------------|:-------------|:--------------|:-------------|:-------|:-------------|:--------|:---------------|:-------|:-----------|:---------|:--------------|:-------|:---------------------|:-----------------|:---------------|:-------------|:--------------------|:--------|:--------|:-----------------------------|:--------|:---------|:--------------|:-----------------|:----------|:--------|:--------|:-------------|:------------------|:--------------------|:--------|:--------|:-----------|:-----------|:-----------------|:---------|:-------------|:-----------|:--------|:-------------|:--------------------|:-------------|:-------------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | X | | X | | X | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | | X | | | | | | | | | | | | | X | X | X | X | | | | X | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 10 |  |  |  |  |  | | X | | | | | | | | | | | | | X | X | X | X | | | | X | | | | | X | | | | X | | | | | X | | | X | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | | X | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | X | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
FadiBoutros/IDiff-Face | ---
license: cc-by-nc-sa-4.0
---
|
heliosprime/twitter_dataset_1713016351 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 11650
num_examples: 26
download_size: 9632
dataset_size: 11650
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713016351"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HANTIFARAH/wikipedia | ---
dataset_info:
config_name: wikimedia__wikipedia__ar
features:
- name: text
dtype: string
- name: source
dtype: string
- name: metadata
dtype: string
splits:
- name: train
num_bytes: 3187890159
num_examples: 1219201
download_size: 1326401260
dataset_size: 3187890159
configs:
- config_name: wikimedia__wikipedia__ar
data_files:
- split: train
path: wikimedia__wikipedia__ar/train-*
---
|
joey234/mmlu-professional_law-neg | ---
dataset_info:
features:
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: question
dtype: string
splits:
- name: test
num_bytes: 1019787
num_examples: 1534
download_size: 554777
dataset_size: 1019787
---
# Dataset Card for "mmlu-professional_law-neg"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
carnival13/hpqa_generator_input | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 260134640
num_examples: 72340
- name: validation
num_bytes: 65033660
num_examples: 18085
- name: test
num_bytes: 26624784
num_examples: 7404
download_size: 25644638
dataset_size: 351793084
---
# Dataset Card for "hpqa_generator_input"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kgr123/quality_counter_4500 | ---
dataset_info:
features:
- name: context
dtype: string
- name: word
dtype: string
- name: claim
dtype: string
- name: label
dtype: int64
splits:
- name: test
num_bytes: 24696573
num_examples: 1929
- name: train
num_bytes: 24496491
num_examples: 1935
- name: validation
num_bytes: 24981972
num_examples: 1941
download_size: 16350150
dataset_size: 74175036
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
Dhika/defectdata | ---
license: unknown
---
|
heliosprime/twitter_dataset_1713190896 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 19209
num_examples: 51
download_size: 18229
dataset_size: 19209
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713190896"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yongsun-yoon/open-ner-english | ---
dataset_info:
features:
- name: text
dtype: string
- name: entities
list:
- name: entity_mentions
sequence: string
- name: entity_type
dtype: string
splits:
- name: train
num_bytes: 51881074.954063065
num_examples: 36711
- name: validation
num_bytes: 12970622.045936935
num_examples: 9178
download_size: 40944137
dataset_size: 64851697.0
---
# Open NER (English)
This is the processed version of [Universal-NER/Pile-NER-type](https://huggingface.co/datasets/Universal-NER/Pile-NER-type). |
gianlucar/rugby_test_2 | ---
task_categories:
- text-generation
language:
- en
tags:
- fine-tuning
- touch rugby
size_categories:
- n<1K
---
# Touch Rugby Rules Dataset (for embeddings)
train.csv is taken from the [International Touch Website](https://cdn.internationaltouch.org/public/FIT%205th%20Edition%20Rulebook.pdf)
test.csv is copy-pasted from the abbreviated rules on the [UK Touch website](https://www.englandtouch.org.uk/develop/coaching/the-rules/). Note that I'm bypassing the pdf-to-text stage.
All text is chunked to a length of 100 tokens with 50% overlap.
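The chunking scheme above can be sketched as follows. This is a minimal illustration only, assuming a simple whitespace tokenizer and a stride of half the chunk size; the actual tokenizer used for the dataset is not specified in this card:

```python
def chunk_tokens(tokens, size=100, overlap=0.5):
    """Split a token list into fixed-size chunks with fractional overlap.

    With size=100 and overlap=0.5, each chunk starts 50 tokens after
    the previous one, so consecutive chunks share 50 tokens.
    """
    step = max(1, int(size * (1 - overlap)))  # stride between chunk starts
    chunks = []
    for start in range(0, len(tokens), step):
        chunks.append(tokens[start:start + size])
        if start + size >= len(tokens):  # last chunk already reaches the end
            break
    return chunks

# Toy "tokenizer": whitespace split (a hypothetical stand-in for the real one)
text = "the quick brown fox jumps over the lazy dog " * 30
tokens = text.split()
chunks = chunk_tokens(tokens, size=100, overlap=0.5)
```

Each chunk is at most 100 tokens, and the first half of every chunk repeats the second half of the chunk before it.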
For educational and non-commercial use only. |
TIGER-Lab/SKGInstruct | ---
license: cc-by-nc-2.0
task_categories:
- text-generation
language:
- en
pretty_name: SKGInstruct
size_categories:
- 100K<n<1M
tags:
- code
- SKG
configs:
- config_name: default
data_files:
- split: train
path: "skginstruct.json"
- split: test
path: "skginstruct_test_file_7b.json"
---
# 🏗️ StructLM: Towards Building Generalist Models for Structured Knowledge Grounding
SKGInstruct is an instruction tuning dataset constructed from 19 structured knowledge grounding datasets, mixed with 🤗 [SlimOrca](https://huggingface.co/datasets/Open-Orca/SlimOrca)
Project Page: [https://tiger-ai-lab.github.io/StructLM/](https://tiger-ai-lab.github.io/StructLM/)
Paper: [https://arxiv.org/pdf/2402.16671.pdf](https://arxiv.org/pdf/2402.16671.pdf)
Code: [https://github.com/TIGER-AI-Lab/StructLM](https://github.com/TIGER-AI-Lab/StructLM)
Models:
7B | [StructLM-7B](https://huggingface.co/TIGER-Lab/StructLM-7B)
13B | [StructLM-13B](https://huggingface.co/TIGER-Lab/StructLM-13B)
34B | [StructLM-34B](https://huggingface.co/TIGER-Lab/StructLM-34B)
## **License**
| Dataset Name | License Type |
|--------------|----------------|
| TabMWP | [Attribution-ShareAlike 4.0 International](https://creativecommons.org/licenses/by-sa/4.0/)|
| SlimOrca | MIT |
| everything else | [Attribution-NonCommercial-ShareAlike 4.0 International](https://creativecommons.org/licenses/by-nc-sa/4.0/)|
## **Citation**
Please cite our paper if you use our data, model or code. Please also kindly cite the original dataset papers.
```
@misc{zhuang2024structlm,
title={StructLM: Towards Building Generalist Models for Structured Knowledge Grounding},
author={Alex Zhuang and Ge Zhang and Tianyu Zheng and Xinrun Du and Junjie Wang and Weiming Ren and Stephen W. Huang and Jie Fu and Xiang Yue and Wenhu Chen},
year={2024},
eprint={2402.16671},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
shidowake/augmxnt_ultra-orca-boros-en-ja-v1_split_6 | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: weight
dtype: float64
- name: source
dtype: string
splits:
- name: train
num_bytes: 20639999.933149945
num_examples: 9397
download_size: 10516607
dataset_size: 20639999.933149945
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
aloobun/mfds | ---
license: wtfpl
--- |
Gummybear05/Y_frequency_speed | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float32
- name: path
dtype: string
- name: sample_rate
dtype: int64
- name: text
dtype: string
- name: scriptId
dtype: int64
- name: fileNm
dtype: string
- name: recrdTime
dtype: float64
- name: recrdQuality
dtype: int64
- name: recrdDt
dtype: string
- name: scriptSetNo
dtype: string
- name: recrdEnvrn
dtype: string
- name: colctUnitCode
dtype: string
- name: cityCode
dtype: string
- name: recrdUnit
dtype: string
- name: convrsThema
dtype: string
- name: gender
dtype: string
- name: recorderId
dtype: string
- name: age
dtype: int64
splits:
- name: train
num_bytes: 2322247497
num_examples: 5400
download_size: 2348923241
dataset_size: 2322247497
---
# Dataset Card for "Y_frequency_speed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
plural-user/trump-tweets-ray | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 4865384
num_examples: 1
- name: test
num_bytes: 272829
num_examples: 1
- name: validation
num_bytes: 271308
num_examples: 1
download_size: 3720431
dataset_size: 5409521
---
# Dataset Card for "trump-tweets-ray"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
boopysaur/user2-dpo-small | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 263014
num_examples: 1601
download_size: 165208
dataset_size: 263014
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-en-1024-20b | ---
pretty_name: Evaluation run of h2oai/h2ogpt-gm-oasst1-en-1024-20b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [h2oai/h2ogpt-gm-oasst1-en-1024-20b](https://huggingface.co/h2oai/h2ogpt-gm-oasst1-en-1024-20b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-en-1024-20b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-17T11:30:13.056869](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-en-1024-20b/blob/main/results_2023-10-17T11-30-13.056869.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.007340604026845637,\n\
\ \"em_stderr\": 0.0008741896875346127,\n \"f1\": 0.059098154362416305,\n\
\ \"f1_stderr\": 0.001498238863999793,\n \"acc\": 0.3440011895836038,\n\
\ \"acc_stderr\": 0.008793500407979769\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.007340604026845637,\n \"em_stderr\": 0.0008741896875346127,\n\
\ \"f1\": 0.059098154362416305,\n \"f1_stderr\": 0.001498238863999793\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.025018953752843062,\n \
\ \"acc_stderr\": 0.004302045046564285\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6629834254143646,\n \"acc_stderr\": 0.013284955769395252\n\
\ }\n}\n```"
repo_url: https://huggingface.co/h2oai/h2ogpt-gm-oasst1-en-1024-20b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|arc:challenge|25_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_17T11_30_13.056869
path:
- '**/details_harness|drop|3_2023-10-17T11-30-13.056869.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-17T11-30-13.056869.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_17T11_30_13.056869
path:
- '**/details_harness|gsm8k|5_2023-10-17T11-30-13.056869.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-17T11-30-13.056869.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hellaswag|10_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:35:35.780060.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T21:35:35.780060.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T21:35:35.780060.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_17T11_30_13.056869
path:
- '**/details_harness|winogrande|5_2023-10-17T11-30-13.056869.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-17T11-30-13.056869.parquet'
- config_name: results
data_files:
- split: 2023_07_19T21_35_35.780060
path:
- results_2023-07-19T21:35:35.780060.parquet
- split: 2023_10_17T11_30_13.056869
path:
- results_2023-10-17T11-30-13.056869.parquet
- split: latest
path:
- results_2023-10-17T11-30-13.056869.parquet
---
# Dataset Card for Evaluation run of h2oai/h2ogpt-gm-oasst1-en-1024-20b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/h2oai/h2ogpt-gm-oasst1-en-1024-20b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [h2oai/h2ogpt-gm-oasst1-en-1024-20b](https://huggingface.co/h2oai/h2ogpt-gm-oasst1-en-1024-20b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-en-1024-20b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-17T11:30:13.056869](https://huggingface.co/datasets/open-llm-leaderboard/details_h2oai__h2ogpt-gm-oasst1-en-1024-20b/blob/main/results_2023-10-17T11-30-13.056869.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.007340604026845637,
"em_stderr": 0.0008741896875346127,
"f1": 0.059098154362416305,
"f1_stderr": 0.001498238863999793,
"acc": 0.3440011895836038,
"acc_stderr": 0.008793500407979769
},
"harness|drop|3": {
"em": 0.007340604026845637,
"em_stderr": 0.0008741896875346127,
"f1": 0.059098154362416305,
"f1_stderr": 0.001498238863999793
},
"harness|gsm8k|5": {
"acc": 0.025018953752843062,
"acc_stderr": 0.004302045046564285
},
"harness|winogrande|5": {
"acc": 0.6629834254143646,
"acc_stderr": 0.013284955769395252
}
}
```
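Because the run timestamps embedded in the result file names are ISO-8601 ordered, the newest results file can be picked with a plain string comparison, no parsing required. A minimal sketch using the two file names listed in this card's `results` configuration:

```python
# Result files are named results_<timestamp>.parquet; ISO-8601 timestamps
# sort lexicographically, so max() yields the file from the newest run.
files = [
    "results_2023-07-19T21:35:35.780060.parquet",
    "results_2023-10-17T11-30-13.056869.parquet",
]
latest = max(files)
print(latest)  # results_2023-10-17T11-30-13.056869.parquet
```

This is the same file the "latest" split of the `results` configuration points to.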
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
tasksource/ReSQ | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
dataset_info:
features:
- name: story
dtype: string
- name: question
dtype: string
- name: q_type
dtype: string
- name: answer
sequence: string
- name: candidate_answers
sequence: string
splits:
- name: train
num_bytes: 388340
num_examples: 2450
- name: dev
num_bytes: 111658
num_examples: 663
download_size: 58830
dataset_size: 499998
---
# Dataset Card for "ReSQ"
https://github.com/HLR/SpaRTUN
```bib
@inproceedings{mirzaee-kordjamshidi-2022-transfer,
title = "Transfer Learning with Synthetic Corpora for Spatial Role Labeling and Reasoning",
author = "Mirzaee, Roshanak and
Kordjamshidi, Parisa",
booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
month = dec,
year = "2022",
address = "Abu Dhabi, United Arab Emirates",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2022.emnlp-main.413",
pages = "6148--6165",
abstract = "",
}
``` |
atmallen/quirky_addition_increment3_bob_easy | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: alice_label
dtype: bool
- name: bob_label
dtype: bool
- name: difficulty
dtype: int64
- name: statement
dtype: string
- name: choices
sequence: string
- name: character
dtype: string
- name: label
dtype: bool
splits:
- name: train
num_bytes: 715448.95114
num_examples: 10591
- name: validation
num_bytes: 74930.994
num_examples: 1110
- name: test
num_bytes: 70640.3548
num_examples: 1046
download_size: 231170
dataset_size: 861020.2999399999
---
# Dataset Card for "quirky_addition_increment3_bob_easy"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_qqp_regularized_reflexives | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 15280
num_examples: 80
- name: test
num_bytes: 171179
num_examples: 869
- name: train
num_bytes: 152619
num_examples: 762
download_size: 190118
dataset_size: 339078
---
# Dataset Card for "MULTI_VALUE_qqp_regularized_reflexives"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cjhyeok/fc_test_data | ---
license: apache-2.0
dataset_info:
features:
- name: system
dtype: string
- name: human
dtype: string
- name: gpt
dtype: string
splits:
- name: train
num_bytes: 76780520
num_examples: 58946
download_size: 10585806
dataset_size: 76780520
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
huggingartists/kehlani | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/kehlani"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.497195 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/a77a2cb56da25c8f9e895bc1df12252b.750x750x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/kehlani">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Kehlani</div>
<a href="https://genius.com/artists/kehlani">
<div style="text-align: center; font-size: 14px;">@kehlani</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/kehlani).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/kehlani")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|256| -| -|
The 'train' split can easily be divided into 'train', 'validation' and 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/kehlani")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
train, validation, test = np.split(datasets['train']['text'], [int(len(datasets['train']['text'])*train_percentage), int(len(datasets['train']['text'])*(train_percentage + validation_percentage))])
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
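As a sanity check, the boundary indices passed to `np.split` above can be computed by hand for this card's 256-example train split (the 90/7/3 percentages are the ones from the snippet; only the arithmetic is shown here):

```python
# Reproduce the slice boundaries np.split receives for n = 256 examples.
n = 256
train_pct, validation_pct = 0.9, 0.07
i1 = int(n * train_pct)                     # end of the train slice
i2 = int(n * (train_pct + validation_pct))  # end of the validation slice
print((i1, i2 - i1, n - i2))  # sizes of (train, validation, test)
```

With 256 rows this yields 230/18/8 examples for train/validation/test.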
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
    author = {Aleksey Korshuk},
    year = {2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
ali-alkhars/cv-tips | ---
language:
- en
tags:
- cv
- resume
- tips
- jobs
pretty_name: CV Tips Dataset
size_categories:
- n<1K
---
This dataset is used to train LMs to suggest CV improvement tips. It's designed to recognise both 'CV' and 'Resume'.
### Dataset Sources
- https://huggingface.co/datasets/gkrishnan/Resume_Best_Practices
- https://www.indeed.com/career-advice/resumes-cover-letters/enhance-cv
- https://www.topuniversities.com/blog/7-simple-effective-ways-make-your-cv-stand-out
- https://www.forbes.com/sites/andrewfennell/2022/02/22/5-ways-to-improve-your-cv/
- https://www.indeed.com/career-advice/resumes-cover-letters/10-resume-writing-tips
- https://www.coursera.org/articles/ways-to-enhance-your-resume
- https://www.wsj.com/articles/18-resume-writing-tips-to-get-you-noticed-11605107446
- https://www.indeed.com/career-advice/resumes-cover-letters/how-to-improve-resume
- https://www.businessnewsdaily.com/3207-resume-writing-tips.html
- https://www.themuse.com/advice/43-resume-tips-that-will-help-you-get-hired
- https://www.themuse.com/advice/how-to-write-software-engineer-resume-example |
Thien0103/DeepFake_Extracted_Face_Images | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': fake
'1': real
splits:
- name: train
num_bytes: 29710554.314
num_examples: 2798
download_size: 29682114
dataset_size: 29710554.314
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
HanxuHU/MMMU_filter | ---
dataset_info:
- config_name: Accounting
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 106541.06666666667
num_examples: 2
download_size: 188911
dataset_size: 106541.06666666667
- config_name: Agriculture
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 119217398.0
num_examples: 30
download_size: 119223107
dataset_size: 119217398.0
- config_name: Architecture_and_Engineering
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 432711.2
num_examples: 18
download_size: 467361
dataset_size: 432711.2
- config_name: Art
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 29934374.0
num_examples: 30
download_size: 29939738
dataset_size: 29934374.0
- config_name: Art_Theory
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 33481398.0
num_examples: 30
download_size: 29783868
dataset_size: 33481398.0
- config_name: Basic_Medical_Science
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 3988243.6333333333
num_examples: 29
download_size: 4093528
dataset_size: 3988243.6333333333
- config_name: Biology
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 7642516.399999999
num_examples: 27
download_size: 8021775
dataset_size: 7642516.399999999
- config_name: Chemistry
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1366537.8
num_examples: 27
download_size: 1362901
dataset_size: 1366537.8
- config_name: Clinical_Medicine
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 10882324.0
num_examples: 30
download_size: 10888251
dataset_size: 10882324.0
- config_name: Computer_Science
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1933724.1333333333
num_examples: 28
download_size: 2009738
dataset_size: 1933724.1333333333
- config_name: Design
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 17922960.0
num_examples: 30
download_size: 16227878
dataset_size: 17922960.0
- config_name: Diagnostics_and_Laboratory_Medicine
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 37106073.0
num_examples: 30
download_size: 37089865
dataset_size: 37106073.0
- config_name: Economics
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 644315.3666666667
num_examples: 13
download_size: 927250
dataset_size: 644315.3666666667
- config_name: Electronics
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 641217.0
num_examples: 30
download_size: 644538
dataset_size: 641217.0
- config_name: Energy_and_Power
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1641775.0
num_examples: 30
download_size: 1646107
dataset_size: 1641775.0
- config_name: Finance
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 35696.36666666667
num_examples: 1
download_size: 31566
dataset_size: 35696.36666666667
- config_name: Geography
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 6448781.533333333
num_examples: 29
download_size: 6611992
dataset_size: 6448781.533333333
- config_name: History
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 8231713.2
num_examples: 28
download_size: 8206800
dataset_size: 8231713.2
- config_name: Literature
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 14240886.0
num_examples: 30
download_size: 14246788
dataset_size: 14240886.0
- config_name: Manage
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1966381.8
num_examples: 18
download_size: 2083274
dataset_size: 1966381.8
- config_name: Marketing
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 343658.13333333336
num_examples: 7
download_size: 859324
dataset_size: 343658.13333333336
- config_name: Materials
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1997713.0
num_examples: 26
download_size: 2199364
dataset_size: 1997713.0
- config_name: Math
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1396152.7
num_examples: 29
download_size: 1435925
dataset_size: 1396152.7
- config_name: Mechanical_Engineering
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 874828.0
num_examples: 30
download_size: 876772
dataset_size: 874828.0
- config_name: Music
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 9359212.0
num_examples: 30
download_size: 9363650
dataset_size: 9359212.0
- config_name: Pharmacy
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1435395.4
num_examples: 26
download_size: 1330630
dataset_size: 1435395.4
- config_name: Physics
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1113970.0
num_examples: 30
download_size: 1117086
dataset_size: 1113970.0
- config_name: Psychology
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 3963314.7
num_examples: 27
download_size: 3978658
dataset_size: 3963314.7
- config_name: Public_Health
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 251434.0
num_examples: 5
download_size: 672165
dataset_size: 251434.0
- config_name: Sociology
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 17840003.766666666
num_examples: 29
download_size: 17595987
dataset_size: 17840003.766666666
configs:
- config_name: Accounting
data_files:
- split: validation
path: Accounting/validation-*
- config_name: Agriculture
data_files:
- split: validation
path: Agriculture/validation-*
- config_name: Architecture_and_Engineering
data_files:
- split: validation
path: Architecture_and_Engineering/validation-*
- config_name: Art
data_files:
- split: validation
path: Art/validation-*
- config_name: Art_Theory
data_files:
- split: validation
path: Art_Theory/validation-*
- config_name: Basic_Medical_Science
data_files:
- split: validation
path: Basic_Medical_Science/validation-*
- config_name: Biology
data_files:
- split: validation
path: Biology/validation-*
- config_name: Chemistry
data_files:
- split: validation
path: Chemistry/validation-*
- config_name: Clinical_Medicine
data_files:
- split: validation
path: Clinical_Medicine/validation-*
- config_name: Computer_Science
data_files:
- split: validation
path: Computer_Science/validation-*
- config_name: Design
data_files:
- split: validation
path: Design/validation-*
- config_name: Diagnostics_and_Laboratory_Medicine
data_files:
- split: validation
path: Diagnostics_and_Laboratory_Medicine/validation-*
- config_name: Economics
data_files:
- split: validation
path: Economics/validation-*
- config_name: Electronics
data_files:
- split: validation
path: Electronics/validation-*
- config_name: Energy_and_Power
data_files:
- split: validation
path: Energy_and_Power/validation-*
- config_name: Finance
data_files:
- split: validation
path: Finance/validation-*
- config_name: Geography
data_files:
- split: validation
path: Geography/validation-*
- config_name: History
data_files:
- split: validation
path: History/validation-*
- config_name: Literature
data_files:
- split: validation
path: Literature/validation-*
- config_name: Manage
data_files:
- split: validation
path: Manage/validation-*
- config_name: Marketing
data_files:
- split: validation
path: Marketing/validation-*
- config_name: Materials
data_files:
- split: validation
path: Materials/validation-*
- config_name: Math
data_files:
- split: validation
path: Math/validation-*
- config_name: Mechanical_Engineering
data_files:
- split: validation
path: Mechanical_Engineering/validation-*
- config_name: Music
data_files:
- split: validation
path: Music/validation-*
- config_name: Pharmacy
data_files:
- split: validation
path: Pharmacy/validation-*
- config_name: Physics
data_files:
- split: validation
path: Physics/validation-*
- config_name: Psychology
data_files:
- split: validation
path: Psychology/validation-*
- config_name: Public_Health
data_files:
- split: validation
path: Public_Health/validation-*
- config_name: Sociology
data_files:
- split: validation
path: Sociology/validation-*
---
open-llm-leaderboard/details_paulml__NeuralOmniWestBeaglake-7B
---
pretty_name: Evaluation run of paulml/NeuralOmniWestBeaglake-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [paulml/NeuralOmniWestBeaglake-7B](https://huggingface.co/paulml/NeuralOmniWestBeaglake-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_paulml__NeuralOmniWestBeaglake-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-05T09:58:00.255850](https://huggingface.co/datasets/open-llm-leaderboard/details_paulml__NeuralOmniWestBeaglake-7B/blob/main/results_2024-02-05T09-58-00.255850.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6452309636710489,\n\
\ \"acc_stderr\": 0.03228854730174679,\n \"acc_norm\": 0.6455942939381135,\n\
\ \"acc_norm_stderr\": 0.03295573791186191,\n \"mc1\": 0.6070991432068543,\n\
\ \"mc1_stderr\": 0.017097248285233065,\n \"mc2\": 0.7509762464971609,\n\
\ \"mc2_stderr\": 0.014412765533767163\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7167235494880546,\n \"acc_stderr\": 0.013167478735134575,\n\
\ \"acc_norm\": 0.7372013651877133,\n \"acc_norm_stderr\": 0.012862523175351335\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7450707030472018,\n\
\ \"acc_stderr\": 0.004349307702735165,\n \"acc_norm\": 0.8969328818960366,\n\
\ \"acc_norm_stderr\": 0.0030342513629399073\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.038035102483515854,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.038035102483515854\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.032579014820998356,\n\
\ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.032579014820998356\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.025305906241590632,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.025305906241590632\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.023904914311782655,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.023904914311782655\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8181818181818182,\n \"acc_stderr\": 0.027479603010538797,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.027479603010538797\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768763\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6435897435897436,\n \"acc_stderr\": 0.024283140529467305,\n\
\ \"acc_norm\": 0.6435897435897436,\n \"acc_norm_stderr\": 0.024283140529467305\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.03120469122515002,\n \
\ \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.03120469122515002\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092427,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092427\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078966,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078966\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621133,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621133\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516301,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516301\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179326,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179326\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n\
\ \"acc_stderr\": 0.013816335389973136,\n \"acc_norm\": 0.8173690932311622,\n\
\ \"acc_norm_stderr\": 0.013816335389973136\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577605,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577605\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4581005586592179,\n\
\ \"acc_stderr\": 0.016663683295020524,\n \"acc_norm\": 0.4581005586592179,\n\
\ \"acc_norm_stderr\": 0.016663683295020524\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.025917806117147158,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.025917806117147158\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n\
\ \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n\
\ \"acc_stderr\": 0.012751075788015055,\n \"acc_norm\": 0.4726205997392438,\n\
\ \"acc_norm_stderr\": 0.012751075788015055\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6568627450980392,\n \"acc_stderr\": 0.01920660684882536,\n \
\ \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.01920660684882536\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n\
\ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774711,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774711\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.5783132530120482,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6070991432068543,\n\
\ \"mc1_stderr\": 0.017097248285233065,\n \"mc2\": 0.7509762464971609,\n\
\ \"mc2_stderr\": 0.014412765533767163\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8492501973164956,\n \"acc_stderr\": 0.010056094631479674\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5921152388172858,\n \
\ \"acc_stderr\": 0.013536742075643086\n }\n}\n```"
repo_url: https://huggingface.co/paulml/NeuralOmniWestBeaglake-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|arc:challenge|25_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|gsm8k|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hellaswag|10_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-05T09-58-00.255850.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-05T09-58-00.255850.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- '**/details_harness|winogrande|5_2024-02-05T09-58-00.255850.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-05T09-58-00.255850.parquet'
- config_name: results
data_files:
- split: 2024_02_05T09_58_00.255850
path:
- results_2024-02-05T09-58-00.255850.parquet
- split: latest
path:
- results_2024-02-05T09-58-00.255850.parquet
---
# Dataset Card for Evaluation run of paulml/NeuralOmniWestBeaglake-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [paulml/NeuralOmniWestBeaglake-7B](https://huggingface.co/paulml/NeuralOmniWestBeaglake-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
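As a small illustration of the split-naming convention (not part of the dataset tooling; the mapping below is inferred from the split names listed in this card's configuration), a run timestamp such as `2024-02-05T09:58:00.255850` becomes the split name `2024_02_05T09_58_00.255850`:

```python
def timestamp_to_split_name(timestamp: str) -> str:
    """Convert a run timestamp (as it appears in result file names, e.g.
    '2024-02-05T09:58:00.255850') into the corresponding split name
    ('2024_02_05T09_58_00.255850'). Dashes and colons become underscores;
    the 'T' separator and the fractional seconds are kept as-is."""
    return timestamp.replace("-", "_").replace(":", "_")

print(timestamp_to_split_name("2024-02-05T09:58:00.255850"))
# → 2024_02_05T09_58_00.255850
```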
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_paulml__NeuralOmniWestBeaglake-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-05T09:58:00.255850](https://huggingface.co/datasets/open-llm-leaderboard/details_paulml__NeuralOmniWestBeaglake-7B/blob/main/results_2024-02-05T09-58-00.255850.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6452309636710489,
"acc_stderr": 0.03228854730174679,
"acc_norm": 0.6455942939381135,
"acc_norm_stderr": 0.03295573791186191,
"mc1": 0.6070991432068543,
"mc1_stderr": 0.017097248285233065,
"mc2": 0.7509762464971609,
"mc2_stderr": 0.014412765533767163
},
"harness|arc:challenge|25": {
"acc": 0.7167235494880546,
"acc_stderr": 0.013167478735134575,
"acc_norm": 0.7372013651877133,
"acc_norm_stderr": 0.012862523175351335
},
"harness|hellaswag|10": {
"acc": 0.7450707030472018,
"acc_stderr": 0.004349307702735165,
"acc_norm": 0.8969328818960366,
"acc_norm_stderr": 0.0030342513629399073
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.038035102483515854,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.038035102483515854
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.032579014820998356,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.032579014820998356
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.025305906241590632,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.025305906241590632
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782655,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782655
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.027479603010538797,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.027479603010538797
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768763,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768763
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6435897435897436,
"acc_stderr": 0.024283140529467305,
"acc_norm": 0.6435897435897436,
"acc_norm_stderr": 0.024283140529467305
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683515,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683515
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.03120469122515002,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.03120469122515002
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092427,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092427
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078966,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078966
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621133,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621133
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516301,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516301
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179326,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179326
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973136,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973136
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577605,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577605
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4581005586592179,
"acc_stderr": 0.016663683295020524,
"acc_norm": 0.4581005586592179,
"acc_norm_stderr": 0.016663683295020524
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.025917806117147158,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.025917806117147158
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294677,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294677
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4726205997392438,
"acc_stderr": 0.012751075788015055,
"acc_norm": 0.4726205997392438,
"acc_norm_stderr": 0.012751075788015055
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389845,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389845
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.01920660684882536,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.01920660684882536
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.02916273841024977,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.02916273841024977
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774711,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774711
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6070991432068543,
"mc1_stderr": 0.017097248285233065,
"mc2": 0.7509762464971609,
"mc2_stderr": 0.014412765533767163
},
"harness|winogrande|5": {
"acc": 0.8492501973164956,
"acc_stderr": 0.010056094631479674
},
"harness|gsm8k|5": {
"acc": 0.5921152388172858,
"acc_stderr": 0.013536742075643086
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
cassanof/python_pass_fail_sample | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 67163
num_examples: 100
download_size: 42182
dataset_size: 67163
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
TheMrguiller/BilbaoQA2 | ---
dataset_info:
features:
- name: caption
dtype: string
- name: image
dtype: image
- name: question
dtype: string
- name: choices
dtype: string
- name: answer
dtype: string
- name: solution
dtype: string
- name: CTH
dtype: bool
splits:
- name: train
num_bytes: 3191572848.375
num_examples: 3781
- name: test
num_bytes: 791791803
num_examples: 946
download_size: 3952860603
dataset_size: 3983364651.375
task_categories:
- question-answering
- visual-question-answering
language:
- en
tags:
- code
size_categories:
- 1K<n<10K
---
# Dataset Card for "BilbaoQA2"
## Dataset Description
- **Homepage:** https://github.com/TheMrguiller/MUCSI_Modal
- **Repository:** https://github.com/TheMrguiller/MUCSI_Modal
- **Paper:** It is a follow-up of the Flamingo model paper
- **Leaderboard:**
- **Point of Contact:** https://github.com/TheMrguiller/MUCSI_Modal
### Dataset Summary
This dataset was collected for a project in the Master's degree in Computation and Intelligent Systems at the University of Deusto. It was assembled by students from webpages well known in the Basque Country, Deia and Getimages. The questions and answers were generated by a set of models that can produce this information from a text description.
### Supported Tasks and Leaderboards
The dataset is prepared to be used for visual question answering.
### Languages
The dataset is in English.
## Dataset Structure
### Data Fields
- `image`: The image, which serves as the context given to the model.
- `question`: The question the model has to answer from the image context.
- `choices`: The multiple-choice options.
- `answer`: The answer selected from the multiple choices.
- `solution`: The chain-of-thought reasoning behind the answer selection.
- `CTH`: A flag indicating whether the row lacks a chain of thought.
### Data Splits
The dataset is split in 80% train and 20% test.
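The 80/20 proportion can be checked directly against the example counts in the dataset metadata above (3,781 train and 946 test examples):

```python
# Example counts taken from the dataset metadata above.
train_examples = 3781
test_examples = 946
total = train_examples + test_examples

train_fraction = train_examples / total
test_fraction = test_examples / total

print(f"train: {train_fraction:.0%}, test: {test_fraction:.0%}")  # train: 80%, test: 20%
```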
## Considerations for Using the Data
The dataset has some flaws regarding the descriptions. Some descriptions are too specific for a captioning task, while others are too generic. There is also an overrepresentation of football match data, so the dataset is not well balanced. Be aware that the poor quality of some descriptions leads to repetition in the answers.
## Additional Information
### Dataset Curators
The curators of this dataset were the students of the Master's degree in Computation and Intelligent Systems at the University of Deusto.
Mitsuki-Sakamoto/alpaca_farm-alpaca_instructions-re-preference | ---
dataset_info:
config_name: reward-model-deberta-v3-large-v2-deberta_sep-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
splits:
- name: val
num_bytes: 6564006
num_examples: 2000
- name: preference
num_bytes: 65488935
num_examples: 20001
download_size: 31650438
dataset_size: 72052941
configs:
- config_name: reward-model-deberta-v3-large-v2-deberta_sep-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500
data_files:
- split: val
path: reward-model-deberta-v3-large-v2-deberta_sep-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/val-*
- split: preference
path: reward-model-deberta-v3-large-v2-deberta_sep-pythia-1.4b_alpaca_farm_instructions_sft_constant_pa-checkpoint-7500/preference-*
---
# Dataset Card for "alpaca_farm-alpaca_instructions-re-preference"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
omrinach/chroma_guidelines | ---
license: apache-2.0
---
|
T-GEN/dreambooth-hackathon-images | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 15344661.0
num_examples: 14
download_size: 14231300
dataset_size: 15344661.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
edumunozsala/instruct-legal-refugiados-es | ---
annotations_creators:
- distillabel
language:
- es
license:
- apache-2.0
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-generation
task_ids:
- extractive-qa
pretty_name: Spanish Instruction Question Answering Legal Refugiados
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: prompt
dtype: string
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: prompt_es
dtype: string
- name: source
dtype: string
- name: page
dtype: string
- name: source_ini
dtype: int64
- name: source_fin
dtype: int64
splits:
- name: train
num_bytes: 72348162
num_examples: 11105
download_size: 20278492
dataset_size: 72348162
---
<p align="center">
<img src="https://huggingface.co/datasets/edumunozsala/instruct-legal-refugiados-es/resolve/main/markus-winkler-Je1MDuITTF4-unsplash.jpg" style="width: 50%;">
</p>
<h1 align="center">Legal Refugiados: Un dataset para QA en temas legales de refugio, asilo y protección internacional.</h1>
## Dataset Description
Instruction Question-Answering Legal Refugiados es una colección de instrucciones extraídas de una gran cantidad de documentos legales, principalmente del gobierno de España, así como de otras instituciones de la UE y de otros países de habla hispana como México o Venezuela. Todos ellos están relacionados con leyes y disposiciones legales sobre ciudadanos refugiados o inmigrantes, sus derechos y su situación. Después de recopilar todos esos documentos, utilizamos la librería [distilabel](https://distilabel.argilla.io/latest/) de [Argilla](https://argilla.io/) para crear un proceso de generación de un dataset de tipo instrucciones (Question-Answer) para poder entrenar un modelo en español.
---
Spanish Instruct-Question Answering Legal Refugiados is a collection of instruction queries extracted from a large number of legal documents, mainly from the government of Spain, as well as from other EU institutions and other Spanish-speaking countries such as Mexico and Venezuela. They all relate to laws and provisions concerning refugee or migrant citizens, their rights, and their situation. After collecting these documents, we used the [distilabel](https://distilabel.argilla.io/latest/) library by [Argilla](https://argilla.io/) to build a process that extracts instruction-format query-answer pairs in order to train a Spanish language model.
### Dataset Summary
Contiene unas 6.800 filas con los siguientes campos:
* `instruction`: una instrucción o consulta.
* `input`: un contexto para resolver la consulta.
* `output`: la salida generada a partir del contexto.
* `prompt`: un prompt en estilo alpaca para pedir la salida dada la instrucción y la entrada.
* `source`: el nombre (aproximado) de la fuente original de donde se extrajo la entrada.
* `page`: número de página dentro de la fuente.
---
Contains about 6,800 rows with the following fields:
* `instruction`: an instruction or query.
* `input`: a context to solve the query.
* `output`: the generated output from the context.
* `prompt`: an alpaca-style prompt asking for the output given the instruction and input.
* `source`: the (approximate) name of the original source the input was extracted from.
* `page`: the page number within the source.
### Supported Tasks
Text-Generation
### Languages
- Spanish (es)
## Dataset Structure
### Data Instances
<pre>
{'prompt': 'Below is a question in Spanish paired with a context also in Spanish that provides further information to solve the question. Write a response that appropriately completes the request.\n\n### Question:\n¿Podrías explicar en qué principios básicos se fundamenta la Ley 5/1984, relativa al derecho de asilo y a la condición de refugiado, según el Real Decreto 203/1995?\n\n### Context:\nReal Decreto 203/1995, de 10 de febrero, por el que se aprueba el Reglamento de aplicación de la Ley 5/1984, ... que deben regir los procedimientos de inadmisión a trámite, tanto en frontera como en el interior del territorio.\n\n### Response:\n',
'instruction': '¿Podrías explicar en qué principios básicos se fundamenta la Ley 5/1984, relativa al derecho de asilo y a la condición de refugiado, según el Real Decreto 203/1995?',
'input': 'Real Decreto 203/1995, de 10 de febrero, por el que se aprueba el Reglamento de aplicación de la Ley 5/1984 ... deben regir los procedimientos de inadmisión a trámite, tanto en frontera como en el interior del territorio.',
'output': 'La Ley 5/1984, relativa al derecho de asilo y a la condición de refugiado, se basa en los siguientes principios fundamentales... garantías adecuadas durante los procedimientos de inadmisión a trámite, tanto en frontera como en el interior del territorio.',
'source': 'BOE_1995_5542_consolidado_asilo_y_refugiado',
'page': '1',
'source_ini': 0,
'source_fin': 1419}
</pre>
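The `prompt` field follows a fixed alpaca-style template, visible in the instance above. As a minimal sketch, the template string below is reconstructed from that instance, and the helper name `build_prompt` is our own, not part of the dataset's tooling:

```python
# Alpaca-style template reconstructed from the `prompt` field shown in the
# data instance above; `build_prompt` is an illustrative helper.
PROMPT_TEMPLATE = (
    "Below is a question in Spanish paired with a context also in Spanish "
    "that provides further information to solve the question. "
    "Write a response that appropriately completes the request.\n\n"
    "### Question:\n{instruction}\n\n"
    "### Context:\n{context}\n\n"
    "### Response:\n"
)

def build_prompt(instruction: str, context: str) -> str:
    # Fill the template with the `instruction` and `input` fields of a row.
    return PROMPT_TEMPLATE.format(instruction=instruction, context=context)
```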
### Data Fields
<pre>
{
prompt: str
instruction: str
input: str
output: str
source: str,
page: int,
source_ini: int,
source_fin:int
}
</pre>
### Data Splits
| Split | Size |
| ------------- | ------------- |
| `train` | 5,100 |
## Dataset Creation
Para la creación del dataset hemos empleado la librería [distilabel](https://distilabel.argilla.io/latest/) de [Argilla](https://argilla.io/), que proporciona un completo juego de herramientas que facilitan y agilizan enormemente esta labor de creación.
El proceso se dividía en 3 etapas:
- Primer paso: en base a un contexto dado de 512 tokens proveniente de un documento legal, solicitábamos al LLM un par de instrucciones o preguntas que pudieran resolverse con dicho contexto. Usamos una tarea de tipo `SelfInstructTask` y proporcionamos a la tarea una descripción o rol de asistente IA. Invocamos al modelo "mistralai/Mixtral-8x7B-Instruct-v0.1" en un Inference Endpoint de Hugging Face. Como resultado de este proceso obteníamos 1-2 instrucciones o preguntas por cada contexto proporcionado.
- Segundo paso: generamos un prompt en formato similar a alpaca, donde dada una instrucción (cada una de las respuestas del paso anterior) y un input o contexto (el correspondiente del paso anterior), se solicita un output.
- Tercer paso: en base a los prompts generados en el paso anterior, usando una tarea de tipo `TextGenerationTask` y con la descripción de asistente IA del paso 1, solicitamos al LLM que nos proporcionara la respuesta. Nuevamente, usamos el modelo Mixtral en un Inference Endpoint de Hugging Face.
Agradecemos el soporte y la guía proporcionados por Argilla para poder llevar a cabo esta tarea.
---
For the creation of the dataset we used the [distilabel](https://distilabel.argilla.io/latest/) library by [Argilla](https://argilla.io/), which provides a complete set of tools that facilitate and greatly speed up this work.
The process was divided into 3 steps:
- First step: based on a given context of 512 tokens coming from a legal document, we asked the LLM for a couple of instructions or questions that could be solved with that context. We used a task of type `SelfInstructTask` and assigned it the role of an AI assistant. We invoked the model "mistralai/Mixtral-8x7B-Instruct-v0.1" on a Hugging Face Inference Endpoint. As a result of this process we obtained 1-2 instructions or questions for each context provided.
- Second step: we generated a prompt in a format similar to alpaca, where given an instruction (each of the answers from the previous step) and an input or context (the corresponding one from the previous step), an output is requested.
- Third step: based on the prompts generated in the previous step, using a `TextGenerationTask` task and the AI assistant description from step 1, we asked the LLM to provide the answer. Again, we used the Mixtral model on a Hugging Face Inference Endpoint.
We thank Argilla for the support and guidance provided in order to accomplish this task.
### Source Data
A continuación, mostramos los nombres "aproximados" de los documentos empleados para extraer todos los contextos que incluye este dataset. Por dichos nombres se pueden inferir las leyes o disposiciones legales que contienen:
---
Below, we show the "approximate" names of the documents used to extract all the contexts included in this dataset. From these names one can infer the laws or legal provisions they contain:
**Documents**:
adhesion_espana_estatuto_refugiados_onu.pdf
BOE_1995_5542_consolidado_asilo_y_refugiado.pdf
BOE_2003_19714_consolidado_proteccion_temporal_afluencia_masiva_desplazados.pdf
BOE_decreto_1800_2008_consolidado_abono_acumulado_prestacion_desempleo_extranjeros.pdf
BOE_decreto_203_1995_consolidado_reglamento_derecho_asilo_refugiado.pdf
BOE_decreto_220_2022_consolidado_reglamento_acogida_proteccion_internacional.pdf
BOE_decreto_557_2011_consolidado_reglamento_derechos_libertades_extranjeros_espana.pdf
BOE_decreto_865_2001_consolidado_reconocimiento_estatuto_apatrida.pdf
BOE_ley_organica_4_2000_consolidado_derechos_libertades_extranjeros_espana.pdf
BOE_orden_1282_2007_consolidado_medios_economicos_entrada_extranjeros_espana.pdf
BOE_orden_1283_2007_consolidado_requisitos_carta_invitacion_extranjeros.pdf
BOE_orden_1485_2021_regulacion_gestion_colectiva_contrataciones_origen.pdf
BOE_orden_1803_2011_importe_tasas_visados_doc_inmigracion_extranjeria.pdf
BOE_orden_3321_2011_expedicion_titulo_viaje_extranjeros.pdf
carta_derechos_fundamentales_UE.pdf
constitucion_espanola_es_cat.txt
Convencion_1951_estatuto_refugiados.pdf
decreto_1325_2003_24_octubre_régimen_protección_temporal_personas_desplazadas.txt
derecho_internacional_sobre_migracion_glosario_migracion_OIM.pdf
determinación_responsable_examen_solicitudes_asilo_UE_15_06_1990.txt
directiva_2001_55_normas_mínimas_concesión_protección_temporal_afluencia_masiva_desplazados_UE.txt
directiva_2011_95_UE_normas_requisitos_reconocimiento_proteccion_internacional.pdf
directiva_2013_32_procedimiento_concesion_retirada_proteccion_internacional.pdf
directiva_2013_33_normas_acogida_solicitantes_proteccion_internacional.pdf
ficheros_incluidos.txt
guia_solicitantes_proteccion_internacional_en_italia.pdf
Ley_12_2009_30_10_reguladora_derecho_asilo_protección_subsidiaria.txt
Ley_de_Extranjería_de_España.pdf
ley_refugiados_asilados_venezuela.pdf
ley_refugiados_proteccion_complementaria_asilo_mexico.pdf
manual_derecho_europeo_asilo_fronteras_inmigracion_edicion_2020.pdf
policia_nacional_doc_solicitud_asilo_refugio.txt
politica_asilo_UE.pdf
protección_internacional.txt
reglamento_UE_L00031-00059_responsabilidad_examen_solicitud_proteccion.pdf
### Personal and Sensitive Information
No se incluye información personal o sensible.
---
No personal or sensitive information included.
## Considerations for Using the Data
### Social Impact of Dataset
Este corpus contribuye al desarrollo de modelos lingüísticos en español.
---
This corpus contributes to the development of language models in Spanish.
### Discussion of Biases
No postprocessing steps were applied to mitigate potential social biases.
## Licensing information
This work is licensed under [Apache License Version 2.0, January 2004](https://www.apache.org/licenses/LICENSE-2.0) License.
## Citation Information
```
```
## Contributions
[N/A] |
Genesius/llama2-gunung | ---
license: apache-2.0
---
|
mtkinit/dataset22666 | ---
pretty_name: dataset22666
tags:
- uci
- dataset 66
---
# dataset22666
Created from AIOD platform |
germank/hh-generated_flan_t5_large_with_features2 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: 'biased:'
dtype: int64
- name: easy-to-understand
dtype: int64
- name: enough-detail
dtype: int64
- name: factuality
dtype: int64
- name: fail-to-consider-context
dtype: int64
- name: fail-to-consider-individual-preferences
dtype: int64
- name: helpfulness
dtype: int64
- name: intent
dtype: int64
- name: readability
dtype: int64
- name: relevance
dtype: int64
- name: repetetive
dtype: int64
- name: specificity
dtype: int64
- name: too-long
dtype: int64
splits:
- name: train
num_bytes: 395548
num_examples: 1600
download_size: 76218
dataset_size: 395548
---
# Dataset Card for "hh-generated_flan_t5_large_with_features2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_qqp_existential_it | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 330798
num_examples: 1870
- name: test
num_bytes: 3260041
num_examples: 18707
- name: train
num_bytes: 2875789
num_examples: 16269
download_size: 4020256
dataset_size: 6466628
---
# Dataset Card for "MULTI_VALUE_qqp_existential_it"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
malteos/oscar-stats | ---
license: apache-2.0
language:
- en
---
Most common domains in [Colossal OSCAR v1](https://huggingface.co/datasets/oscar-corpus/colossal-oscar-1.0) (English subset) according to tokens.
Tokens are estimated based on total bytes and the [BLOOM tokenizer](https://huggingface.co/bigscience/bloom). |
nerdie01/emotions-modified | ---
license: apache-2.0
---
|
saurabh1896/OMR-scanned-documents | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 8217916.0
num_examples: 36
download_size: 8174461
dataset_size: 8217916.0
---
A medical forms dataset containing scanned documents is a valuable resource for healthcare professionals, researchers, and institutions seeking to streamline and improve their administrative and patient care processes. This dataset comprises digitized versions of various medical forms, such as patient intake forms, consent forms, health assessment questionnaires, and more, which have been scanned for electronic storage and easy access.
These scanned medical forms preserve the layout and structure of the original paper documents, including checkboxes, text fields, and signature spaces. Researchers and healthcare organizations can leverage this dataset to develop automated data extraction solutions, electronic health record (EHR) systems, and machine learning models for tasks like form recognition, data validation, and patient data management.
Additionally, this dataset serves as a valuable training and evaluation resource for image processing and optical character recognition (OCR) algorithms, enhancing the accuracy and efficiency of document digitization efforts within the healthcare sector. With the potential to improve data accuracy, reduce administrative burdens, and enhance patient care, the medical forms dataset with scanned documents is a cornerstone for advancing healthcare data management and accessibility.
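As an illustrative, dataset-independent sketch of the kind of image-processing step mentioned above, a simple global threshold is a common first stage before checkbox detection or OCR on a scanned grayscale page (the function and the pixel values below are purely for demonstration and are not tied to this dataset's files):

```python
import numpy as np

def binarize(gray_page: np.ndarray, threshold: int = 128) -> np.ndarray:
    """Return a binary mask where ink (dark pixels) is 1 and paper is 0."""
    return (gray_page < threshold).astype(np.uint8)

# Tiny synthetic 2x2 grayscale patch: bright paper vs. dark ink pixels.
patch = np.array([[250, 10],
                  [200, 90]], dtype=np.uint8)
mask = binarize(patch)
print(mask.tolist())  # [[0, 1], [0, 1]]
```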
|
Maiia/mcphrasy_test_skill_tok | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: query_pos
dtype: int64
- name: phrase
dtype: string
splits:
- name: train
num_bytes: 583858889
num_examples: 3001935
download_size: 240587059
dataset_size: 583858889
---
# Dataset Card for "mcphrasy_test_skill_tok"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_NLUHOPOE__test-case-2 | ---
pretty_name: Evaluation run of NLUHOPOE/test-case-2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NLUHOPOE/test-case-2](https://huggingface.co/NLUHOPOE/test-case-2) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NLUHOPOE__test-case-2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-20T11:04:43.653621](https://huggingface.co/datasets/open-llm-leaderboard/details_NLUHOPOE__test-case-2/blob/main/results_2024-02-20T11-04-43.653621.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5618046040551106,\n\
\ \"acc_stderr\": 0.03369255677374868,\n \"acc_norm\": 0.5657680253794599,\n\
\ \"acc_norm_stderr\": 0.034384024414576136,\n \"mc1\": 0.33659730722154224,\n\
\ \"mc1_stderr\": 0.016542412809494887,\n \"mc2\": 0.49591167596042596,\n\
\ \"mc2_stderr\": 0.015277273398245531\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5264505119453925,\n \"acc_stderr\": 0.01459093135812017,\n\
\ \"acc_norm\": 0.5853242320819113,\n \"acc_norm_stderr\": 0.014397070564409174\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6014738099980084,\n\
\ \"acc_stderr\": 0.004885942040894565,\n \"acc_norm\": 0.7939653455486955,\n\
\ \"acc_norm_stderr\": 0.0040362906027860595\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5481481481481482,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.5481481481481482,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.03878139888797611,\n\
\ \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.03878139888797611\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.630188679245283,\n \"acc_stderr\": 0.029711421880107936,\n\
\ \"acc_norm\": 0.630188679245283,\n \"acc_norm_stderr\": 0.029711421880107936\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n\
\ \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.5260115606936416,\n \"acc_stderr\": 0.03807301726504513,\n\
\ \"acc_norm\": 0.5260115606936416,\n \"acc_norm_stderr\": 0.03807301726504513\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n\
\ \"acc_stderr\": 0.047551296160629475,\n \"acc_norm\": 0.35294117647058826,\n\
\ \"acc_norm_stderr\": 0.047551296160629475\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5148936170212766,\n\
\ \"acc_stderr\": 0.032671518489247764,\n \"acc_norm\": 0.5148936170212766,\n\
\ \"acc_norm_stderr\": 0.032671518489247764\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.40350877192982454,\n \"acc_stderr\": 0.046151869625837026,\n\
\ \"acc_norm\": 0.40350877192982454,\n \"acc_norm_stderr\": 0.046151869625837026\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.4896551724137931,\n \"acc_stderr\": 0.04165774775728763,\n \"\
acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35185185185185186,\n \"acc_stderr\": 0.02459497512892094,\n \"\
acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.02459497512892094\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.35161290322580646,\n \"acc_stderr\": 0.027162537826948458,\n \"\
acc_norm\": 0.35161290322580646,\n \"acc_norm_stderr\": 0.027162537826948458\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\"\
: 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.03427743175816524,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.03427743175816524\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7222222222222222,\n \"acc_stderr\": 0.03191178226713548,\n \"\
acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03191178226713548\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7875647668393783,\n \"acc_stderr\": 0.029519282616817234,\n\
\ \"acc_norm\": 0.7875647668393783,\n \"acc_norm_stderr\": 0.029519282616817234\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4512820512820513,\n \"acc_stderr\": 0.025230381238934833,\n\
\ \"acc_norm\": 0.4512820512820513,\n \"acc_norm_stderr\": 0.025230381238934833\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145658,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145658\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6050420168067226,\n \"acc_stderr\": 0.03175367846096625,\n \
\ \"acc_norm\": 0.6050420168067226,\n \"acc_norm_stderr\": 0.03175367846096625\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7761467889908257,\n \"acc_stderr\": 0.01787121776779024,\n \"\
acc_norm\": 0.7761467889908257,\n \"acc_norm_stderr\": 0.01787121776779024\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7401960784313726,\n \"acc_stderr\": 0.030778554678693264,\n \"\
acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.030778554678693264\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.02730348459906945,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.02730348459906945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677698,\n\
\ \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677698\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6942148760330579,\n \"acc_stderr\": 0.04205953933884123,\n \"\
acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.04205953933884123\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n\
\ \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8034188034188035,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.8034188034188035,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.014866821664709588,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.014866821664709588\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.025992472029306386,\n\
\ \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.025992472029306386\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3240223463687151,\n\
\ \"acc_stderr\": 0.015652542496421132,\n \"acc_norm\": 0.3240223463687151,\n\
\ \"acc_norm_stderr\": 0.015652542496421132\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.02758281141515961,\n\
\ \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.02758281141515961\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.639871382636656,\n\
\ \"acc_stderr\": 0.027264297599804015,\n \"acc_norm\": 0.639871382636656,\n\
\ \"acc_norm_stderr\": 0.027264297599804015\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.654320987654321,\n \"acc_stderr\": 0.02646248777700187,\n\
\ \"acc_norm\": 0.654320987654321,\n \"acc_norm_stderr\": 0.02646248777700187\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.39361702127659576,\n \"acc_stderr\": 0.029144544781596147,\n \
\ \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.029144544781596147\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38852672750977835,\n\
\ \"acc_stderr\": 0.012448817838292351,\n \"acc_norm\": 0.38852672750977835,\n\
\ \"acc_norm_stderr\": 0.012448817838292351\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5772058823529411,\n \"acc_stderr\": 0.030008562845003483,\n\
\ \"acc_norm\": 0.5772058823529411,\n \"acc_norm_stderr\": 0.030008562845003483\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5604575163398693,\n \"acc_stderr\": 0.020079420408087918,\n \
\ \"acc_norm\": 0.5604575163398693,\n \"acc_norm_stderr\": 0.020079420408087918\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6693877551020408,\n \"acc_stderr\": 0.030116426296540603,\n\
\ \"acc_norm\": 0.6693877551020408,\n \"acc_norm_stderr\": 0.030116426296540603\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2885572139303483,\n\
\ \"acc_stderr\": 0.03203841040213321,\n \"acc_norm\": 0.2885572139303483,\n\
\ \"acc_norm_stderr\": 0.03203841040213321\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33659730722154224,\n\
\ \"mc1_stderr\": 0.016542412809494887,\n \"mc2\": 0.49591167596042596,\n\
\ \"mc2_stderr\": 0.015277273398245531\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7529597474348856,\n \"acc_stderr\": 0.012121402942855575\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.38817285822592873,\n \
\ \"acc_stderr\": 0.013423607564002743\n }\n}\n```"
repo_url: https://huggingface.co/NLUHOPOE/test-case-2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|arc:challenge|25_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|arc:challenge|25_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|gsm8k|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|gsm8k|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hellaswag|10_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hellaswag|10_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T10-53-46.187180.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T11-04-43.653621.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-20T11-04-43.653621.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- '**/details_harness|winogrande|5_2024-02-20T10-53-46.187180.parquet'
- split: 2024_02_20T11_04_43.653621
path:
- '**/details_harness|winogrande|5_2024-02-20T11-04-43.653621.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-20T11-04-43.653621.parquet'
- config_name: results
data_files:
- split: 2024_02_20T10_53_46.187180
path:
- results_2024-02-20T10-53-46.187180.parquet
- split: 2024_02_20T11_04_43.653621
path:
- results_2024-02-20T11-04-43.653621.parquet
- split: latest
path:
- results_2024-02-20T11-04-43.653621.parquet
---
# Dataset Card for Evaluation run of NLUHOPOE/test-case-2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NLUHOPOE/test-case-2](https://huggingface.co/NLUHOPOE/test-case-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NLUHOPOE__test-case-2",
"harness_winogrande_5",
split="train")
```
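The timestamped split names are derived mechanically from the run timestamps listed above (for example, run `2024-02-20T11:04:43.653621` becomes split `2024_02_20T11_04_43.653621`). As a minimal sketch, the hypothetical helper below (not part of the `datasets` API) shows the mapping, which is useful when selecting a specific run instead of `latest`:

```python
def run_timestamp_to_split(ts: str) -> str:
    """Convert a run timestamp into the corresponding split name.

    Split names replace '-' and ':' with '_' while keeping the
    fractional-second suffix unchanged.
    """
    return ts.replace("-", "_").replace(":", "_")


# e.g. pass the result as `split=` to load_dataset:
# load_dataset("open-llm-leaderboard/details_NLUHOPOE__test-case-2",
#              "harness_winogrande_5",
#              split=run_timestamp_to_split("2024-02-20T11:04:43.653621"))
```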
## Latest results
These are the [latest results from run 2024-02-20T11:04:43.653621](https://huggingface.co/datasets/open-llm-leaderboard/details_NLUHOPOE__test-case-2/blob/main/results_2024-02-20T11-04-43.653621.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5618046040551106,
"acc_stderr": 0.03369255677374868,
"acc_norm": 0.5657680253794599,
"acc_norm_stderr": 0.034384024414576136,
"mc1": 0.33659730722154224,
"mc1_stderr": 0.016542412809494887,
"mc2": 0.49591167596042596,
"mc2_stderr": 0.015277273398245531
},
"harness|arc:challenge|25": {
"acc": 0.5264505119453925,
"acc_stderr": 0.01459093135812017,
"acc_norm": 0.5853242320819113,
"acc_norm_stderr": 0.014397070564409174
},
"harness|hellaswag|10": {
"acc": 0.6014738099980084,
"acc_stderr": 0.004885942040894565,
"acc_norm": 0.7939653455486955,
"acc_norm_stderr": 0.0040362906027860595
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5481481481481482,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.5481481481481482,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.03878139888797611,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.03878139888797611
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.630188679245283,
"acc_stderr": 0.029711421880107936,
"acc_norm": 0.630188679245283,
"acc_norm_stderr": 0.029711421880107936
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.03807301726504513,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.03807301726504513
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.047551296160629475,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.047551296160629475
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5148936170212766,
"acc_stderr": 0.032671518489247764,
"acc_norm": 0.5148936170212766,
"acc_norm_stderr": 0.032671518489247764
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.046151869625837026,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.046151869625837026
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.02459497512892094,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.02459497512892094
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.35161290322580646,
"acc_stderr": 0.027162537826948458,
"acc_norm": 0.35161290322580646,
"acc_norm_stderr": 0.027162537826948458
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.03427743175816524,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.03427743175816524
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03191178226713548,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03191178226713548
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7875647668393783,
"acc_stderr": 0.029519282616817234,
"acc_norm": 0.7875647668393783,
"acc_norm_stderr": 0.029519282616817234
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4512820512820513,
"acc_stderr": 0.025230381238934833,
"acc_norm": 0.4512820512820513,
"acc_norm_stderr": 0.025230381238934833
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145658,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145658
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6050420168067226,
"acc_stderr": 0.03175367846096625,
"acc_norm": 0.6050420168067226,
"acc_norm_stderr": 0.03175367846096625
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7761467889908257,
"acc_stderr": 0.01787121776779024,
"acc_norm": 0.7761467889908257,
"acc_norm_stderr": 0.01787121776779024
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.030778554678693264,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.030778554678693264
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.02730348459906945,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.02730348459906945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6870229007633588,
"acc_stderr": 0.04066962905677698,
"acc_norm": 0.6870229007633588,
"acc_norm_stderr": 0.04066962905677698
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6942148760330579,
"acc_stderr": 0.04205953933884123,
"acc_norm": 0.6942148760330579,
"acc_norm_stderr": 0.04205953933884123
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8034188034188035,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.8034188034188035,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.014866821664709588,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.014866821664709588
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.025992472029306386,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.025992472029306386
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3240223463687151,
"acc_stderr": 0.015652542496421132,
"acc_norm": 0.3240223463687151,
"acc_norm_stderr": 0.015652542496421132
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.02758281141515961,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.02758281141515961
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.639871382636656,
"acc_stderr": 0.027264297599804015,
"acc_norm": 0.639871382636656,
"acc_norm_stderr": 0.027264297599804015
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.654320987654321,
"acc_stderr": 0.02646248777700187,
"acc_norm": 0.654320987654321,
"acc_norm_stderr": 0.02646248777700187
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.39361702127659576,
"acc_stderr": 0.029144544781596147,
"acc_norm": 0.39361702127659576,
"acc_norm_stderr": 0.029144544781596147
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.38852672750977835,
"acc_stderr": 0.012448817838292351,
"acc_norm": 0.38852672750977835,
"acc_norm_stderr": 0.012448817838292351
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5772058823529411,
"acc_stderr": 0.030008562845003483,
"acc_norm": 0.5772058823529411,
"acc_norm_stderr": 0.030008562845003483
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5604575163398693,
"acc_stderr": 0.020079420408087918,
"acc_norm": 0.5604575163398693,
"acc_norm_stderr": 0.020079420408087918
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6693877551020408,
"acc_stderr": 0.030116426296540603,
"acc_norm": 0.6693877551020408,
"acc_norm_stderr": 0.030116426296540603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2885572139303483,
"acc_stderr": 0.03203841040213321,
"acc_norm": 0.2885572139303483,
"acc_norm_stderr": 0.03203841040213321
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.33659730722154224,
"mc1_stderr": 0.016542412809494887,
"mc2": 0.49591167596042596,
"mc2_stderr": 0.015277273398245531
},
"harness|winogrande|5": {
"acc": 0.7529597474348856,
"acc_stderr": 0.012121402942855575
},
"harness|gsm8k|5": {
"acc": 0.38817285822592873,
"acc_stderr": 0.013423607564002743
}
}
```
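For a quick sanity check on files like the one above, the per-task accuracies can be averaged with plain Python; the sketch below uses just two of the tasks (values copied from the JSON), not the full set the leaderboard aggregates over:

```python
# Mean accuracy over a small subset of the per-task results above.
results = {
    "harness|hendrycksTest-computer_security|5": {"acc": 0.7},
    "harness|hendrycksTest-marketing|5": {"acc": 0.8034188034188035},
}
mean_acc = sum(v["acc"] for v in results.values()) / len(results)
print(round(mean_acc, 4))  # 0.7517
```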
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
FinchResearch/OpenPlatypus-Alpaca | ---
license: apache-2.0
size_categories:
- 10K<n<100K
---
### OpenPlatypus-Alpaca
A merged dataset combining the Open-Platypus and Alpaca datasets. |
vvtq/control_train_4k | ---
dataset_info:
features:
- name: image
dtype: image
- name: noised
dtype: image
- name: image_caption
dtype: string
splits:
- name: train
num_bytes: 5589119440.0
num_examples: 3924
download_size: 3500038747
dataset_size: 5589119440.0
---
# Dataset Card for "control_train_4k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MichaelJH/Ryu-AI.datadict | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
sequence: string
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 7090578
num_examples: 21460
download_size: 1783799
dataset_size: 7090578
---
# Dataset Card for "Ryu-AI.datadict"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
semiotic/spider_original | ---
dataset_info:
features:
- name: type
dtype: string
- name: question
dtype: string
- name: query
dtype: string
- name: db_id
dtype: string
- name: schema
dtype: string
splits:
- name: train
num_bytes: 9331918
num_examples: 7000
- name: val
num_bytes: 1041190
num_examples: 1034
- name: test
num_bytes: 2178834
num_examples: 2147
download_size: 644206
dataset_size: 12551942
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
---
|
OpenDILabCommunity/fake_browser_state_zoo | ---
license: apache-2.0
---
|
shuvom/red_dot | ---
license: mit
---
|
knkarthick/dialogsum | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license: cc-by-nc-sa-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- summarization
- text2text-generation
- text-generation
task_ids: []
pretty_name: DIALOGSum Corpus
tags:
- dialogue-summary
- one-liner-summary
- meeting-title
- email-subject
---
# Dataset Card for DIALOGSum Corpus
## Dataset Description
### Links
- **Homepage:** https://aclanthology.org/2021.findings-acl.449
- **Repository:** https://github.com/cylnlp/dialogsum
- **Paper:** https://aclanthology.org/2021.findings-acl.449
- **Point of Contact:** https://huggingface.co/knkarthick
### Dataset Summary
DialogSum is a large-scale dialogue summarization dataset, consisting of 13,460 dialogues (plus 100 held-out dialogues for topic generation) with corresponding manually labeled summaries and topics.
### Languages
English
## Dataset Structure
### Data Instances
DialogSum is a large-scale dialogue summarization dataset, consisting of 13,460 dialogues (plus 1,000 additional test instances) split into train, validation and test.
The first instance in the training set:
{'id': 'train_0', 'summary': "Mr. Smith's getting a check-up, and Doctor Hawkins advises him to have one every year. Hawkins'll give some information about their classes and medications to help Mr. Smith quit smoking.", 'dialogue': "#Person1#: Hi, Mr. Smith. I'm Doctor Hawkins. Why are you here today?\n#Person2#: I found it would be a good idea to get a check-up.\n#Person1#: Yes, well, you haven't had one for 5 years. You should have one every year.\n#Person2#: I know. I figure as long as there is nothing wrong, why go see the doctor?\n#Person1#: Well, the best way to avoid serious illnesses is to find out about them early. So try to come at least once a year for your own good.\n#Person2#: Ok.\n#Person1#: Let me see here. Your eyes and ears look fine. Take a deep breath, please. Do you smoke, Mr. Smith?\n#Person2#: Yes.\n#Person1#: Smoking is the leading cause of lung cancer and heart disease, you know. You really should quit.\n#Person2#: I've tried hundreds of times, but I just can't seem to kick the habit.\n#Person1#: Well, we have classes and some medications that might help. I'll give you more information before you leave.\n#Person2#: Ok, thanks doctor.", 'topic': "get a check-up"}
### Data Fields
- dialogue: text of dialogue.
- summary: human written summary of the dialogue.
- topic: human written topic/one liner of the dialogue.
- id: unique file id of an example.
### Data Splits
- train: 12460
- val: 500
- test: 1500
- holdout: 100 [Only 3 features: id, dialogue, topic]
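To make the schema concrete, each example can be handled as a plain mapping over the four fields listed above; a minimal stand-in record (hypothetical values following the documented schema, not drawn from the corpus) shows how the holdout split differs:

```python
# Stand-in DialogSum record with the four documented fields.
example = {
    "id": "train_0",
    "dialogue": "#Person1#: Hi, Mr. Smith. ...\n#Person2#: ...",
    "summary": "A short human-written summary of the dialogue.",
    "topic": "get a check-up",
}

# The holdout split ships only three of these fields: id, dialogue, topic.
holdout_record = {k: v for k, v in example.items() if k != "summary"}
print(sorted(holdout_record))  # ['dialogue', 'id', 'topic']
```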
## Dataset Creation
### Curation Rationale
In paper:
We collect dialogue data for DialogSum from three public dialogue corpora, namely Dailydialog (Li et al., 2017), DREAM (Sun et al., 2019) and MuTual (Cui et al., 2019), as well as an English speaking practice website. These datasets contain face-to-face spoken dialogues that cover a wide range of daily-life topics, including schooling, work, medication, shopping, leisure, and travel. Most conversations take place between friends and colleagues, and between service providers and customers.
Compared with previous datasets, dialogues from DialogSum have distinct characteristics:
Under rich real-life scenarios, including more diverse task-oriented scenarios;
Have clear communication patterns and intents, which is valuable to serve as summarization sources;
Have a reasonable length, which suits the purpose of automatic summarization.
We ask annotators to summarize each dialogue based on the following criteria:
Convey the most salient information;
Be brief;
Preserve important named entities within the conversation;
Be written from an observer perspective;
Be written in formal language.
### Who are the source language producers?
linguists
### Who are the annotators?
language experts
## Licensing Information
CC BY-NC-SA 4.0
## Citation Information
```
@inproceedings{chen-etal-2021-dialogsum,
title = "{D}ialog{S}um: {A} Real-Life Scenario Dialogue Summarization Dataset",
author = "Chen, Yulong and
Liu, Yang and
Chen, Liang and
Zhang, Yue",
booktitle = "Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021",
month = aug,
year = "2021",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2021.findings-acl.449",
doi = "10.18653/v1/2021.findings-acl.449",
pages = "5062--5074",
}
```
## Contributions
Thanks to [@cylnlp](https://github.com/cylnlp) for adding this dataset. |
mstz/adult | ---
language:
- en
tags:
- adult
- tabular_classification
- binary_classification
- multiclass_classification
- UCI
pretty_name: Adult
size_categories:
- 10K<n<100K
task_categories:
- tabular-classification
configs:
- encoding
- income
- income-no race
- race
license: cc
---
# Adult
The [Adult dataset](https://archive.ics.uci.edu/ml/datasets/Adult) from the [UCI ML repository](https://archive.ics.uci.edu/ml/datasets).
Census dataset including personal characteristics of a person and whether their income exceeds a threshold.
# Configurations and tasks
| **Configuration** | **Task** | Description |
|-------------------|---------------------------|-----------------------------------------------------------------|
| encoding | | Encoding dictionary showing original values of encoded features.|
| income | Binary classification | Classify the person's income as over or under the threshold. |
| income-no race | Binary classification | As `income`, but the `race` feature is removed. |
| race | Multiclass classification | Predict the race of the individual. |
# Usage
```python
from datasets import load_dataset
dataset = load_dataset("mstz/adult", "income")["train"]
```
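Because the target is always the last column of whichever configuration is loaded, a generic feature/target split only needs the column order; the row below is a hypothetical illustration (not real census data) for the `income` configuration:

```python
# Hypothetical row for the "income" configuration; in every
# configuration the target is the last feature.
row = {
    "age": 39,
    "capital_gain": 2174.0,
    "capital_loss": 0.0,
    "is_male": True,
    "over_threshold": 0,
}
*feature_names, target_name = row  # dicts preserve insertion order
features = {k: row[k] for k in feature_names}
label = row[target_name]
print(target_name, label)  # over_threshold 0
```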
# Features
The target feature changes according to the selected configuration and is always the last column of the dataset.
|**Feature** |**Type** | **Description** |
|-------------------------------|-----------|------------------------------------------------------------|
|`age` |`[int64]` | Age of the person. |
|`capital_gain` |`[float64]`| Capital gained by the person. |
|`capital_loss` |`[float64]`| Capital lost by the person. |
|`education` |`[int8]` | Education level: the higher, the more educated the person. |
|`final_weight` |`[int64]` | |
|`hours_worked_per_week` |`[int64]` | Hours worked per week. |
|`marital_status` |`[string]` | Marital status of the person. |
|`native_country` |`[string]` | Native country of the person. |
|`occupation` |`[string]` | Job of the person. |
|`race` |`[string]` | Race of the person. |
|`relationship` |`[string]` | |
|`is_male`                      |`[bool]`   | Whether the person is male.                                |
|`workclass` |`[string]` | Type of job of the person. |
|**over_threshold** |`int8` | `1` for income `>= 50k$`, `0` otherwise. | |
Sagar12/text2sql | ---
license: unknown
---
|
Birchlabs/openai-prm800k-phase1_train-stepwise-best | ---
license: mit
---
|
HuggingFaceH4/summarize-from-feedback | ---
dataset_info:
features:
- name: meta
struct:
- name: id
dtype: string
- name: post
dtype: string
- name: title
dtype: string
- name: subreddit
dtype: string
- name: site
dtype: string
- name: article
dtype: string
- name: responses
list:
- name: text
dtype: string
- name: policy
dtype: string
- name: note
dtype: string
- name: label
dtype: int32
- name: worker
dtype: string
- name: batch
dtype: string
- name: split
dtype: string
- name: extra
struct:
- name: confidence
dtype: int32
splits:
- name: train
num_bytes: 172539153
num_examples: 92858
- name: validation
num_bytes: 170579710
num_examples: 86086
download_size: 43943406
dataset_size: 343118863
---
# Dataset Card for "summarize-from-feedback"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HamdanXI/paradetox_editOps_preprocess | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: en_toxic_comment
dtype: string
- name: en_neutral_comment
dtype: string
- name: edit_ops
sequence:
sequence: string
- name: masked_comment
dtype: string
splits:
- name: train
num_bytes: 5469950
num_examples: 19744
download_size: 0
dataset_size: 5469950
---
# Dataset Card for "paradetox_editOps_preprocess"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_soniox__Soniox-7B-v1.0 | ---
pretty_name: Evaluation run of soniox/Soniox-7B-v1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [soniox/Soniox-7B-v1.0](https://huggingface.co/soniox/Soniox-7B-v1.0) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_soniox__Soniox-7B-v1.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-19T23:00:10.357078](https://huggingface.co/datasets/open-llm-leaderboard/details_soniox__Soniox-7B-v1.0/blob/main/results_2024-01-19T23-00-10.357078.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6439461862864996,\n\
\ \"acc_stderr\": 0.032256074282620416,\n \"acc_norm\": 0.6467921625949077,\n\
\ \"acc_norm_stderr\": 0.032899600410563404,\n \"mc1\": 0.37209302325581395,\n\
\ \"mc1_stderr\": 0.016921090118814035,\n \"mc2\": 0.5384060653321814,\n\
\ \"mc2_stderr\": 0.015406940325739558\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6092150170648464,\n \"acc_stderr\": 0.014258563880513785,\n\
\ \"acc_norm\": 0.6390784982935154,\n \"acc_norm_stderr\": 0.014034761386175452\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.632244572794264,\n\
\ \"acc_stderr\": 0.004812088620277182,\n \"acc_norm\": 0.8255327623979287,\n\
\ \"acc_norm_stderr\": 0.0037873515193708137\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n\
\ \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933713,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n\
\ \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n\
\ \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268528,\n \"\
acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268528\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"\
acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338642,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338642\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121437,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121437\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.02407869658063548,\n \
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.02407869658063548\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465073,\n \
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465073\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.02995382389188704,\n \
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.02995382389188704\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659806,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659806\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163248,\n \"\
acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163248\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.034086558679777494,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.034086558679777494\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8284313725490197,\n \"acc_stderr\": 0.02646056956124064,\n \"\
acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.02646056956124064\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8270042194092827,\n \"acc_stderr\": 0.024621562866768427,\n \
\ \"acc_norm\": 0.8270042194092827,\n \"acc_norm_stderr\": 0.024621562866768427\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n\
\ \"acc_stderr\": 0.03050028317654585,\n \"acc_norm\": 0.7085201793721974,\n\
\ \"acc_norm_stderr\": 0.03050028317654585\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728744,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728744\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709697,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709697\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.0349260647662379,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.0349260647662379\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281376,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281376\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8135376756066411,\n\
\ \"acc_stderr\": 0.013927751372001505,\n \"acc_norm\": 0.8135376756066411,\n\
\ \"acc_norm_stderr\": 0.013927751372001505\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.02468531686725781,\n\
\ \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.02468531686725781\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.016384638410380823,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.016384638410380823\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.026256053835718964,\n\
\ \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.026256053835718964\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7623456790123457,\n \"acc_stderr\": 0.02368359183700856,\n\
\ \"acc_norm\": 0.7623456790123457,\n \"acc_norm_stderr\": 0.02368359183700856\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.470013037809648,\n\
\ \"acc_stderr\": 0.012747248967079067,\n \"acc_norm\": 0.470013037809648,\n\
\ \"acc_norm_stderr\": 0.012747248967079067\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170595,\n\
\ \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170595\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578327,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37209302325581395,\n\
\ \"mc1_stderr\": 0.016921090118814035,\n \"mc2\": 0.5384060653321814,\n\
\ \"mc2_stderr\": 0.015406940325739558\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7805840568271507,\n \"acc_stderr\": 0.01163126836060778\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5625473843821076,\n \
\ \"acc_stderr\": 0.013664299060751915\n }\n}\n```"
repo_url: https://huggingface.co/KaeriJenti/kaori-70b-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|arc:challenge|25_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|gsm8k|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hellaswag|10_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T23-00-10.357078.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-19T23-00-10.357078.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- '**/details_harness|winogrande|5_2024-01-19T23-00-10.357078.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-19T23-00-10.357078.parquet'
- config_name: results
data_files:
- split: 2024_01_19T23_00_10.357078
path:
- results_2024-01-19T23-00-10.357078.parquet
- split: latest
path:
- results_2024-01-19T23-00-10.357078.parquet
---
# Dataset Card for Evaluation run of soniox/Soniox-7B-v1.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [soniox/Soniox-7B-v1.0](https://huggingface.co/soniox/Soniox-7B-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_soniox__Soniox-7B-v1.0",
"harness_winogrande_5",
split="train")
```
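Each config name in the YAML header above maps to a parquet glob built from the task name, the few-shot count, and the run timestamp. A minimal sketch of that naming scheme (the helper `detail_glob` is hypothetical, for illustration only, not part of any library):

```python
def detail_glob(task: str, shots: int, timestamp: str) -> str:
    """Reconstruct the data_files glob used for a harness detail config."""
    return f"**/details_harness|{task}|{shots}_{timestamp}.parquet"

# Reproduces the harness_winogrande_5 entry from the YAML header above.
print(detail_glob("winogrande", 5, "2024-01-19T23-00-10.357078"))
# **/details_harness|winogrande|5_2024-01-19T23-00-10.357078.parquet
```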
## Latest results
These are the [latest results from run 2024-01-19T23:00:10.357078](https://huggingface.co/datasets/open-llm-leaderboard/details_soniox__Soniox-7B-v1.0/blob/main/results_2024-01-19T23-00-10.357078.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6439461862864996,
"acc_stderr": 0.032256074282620416,
"acc_norm": 0.6467921625949077,
"acc_norm_stderr": 0.032899600410563404,
"mc1": 0.37209302325581395,
"mc1_stderr": 0.016921090118814035,
"mc2": 0.5384060653321814,
"mc2_stderr": 0.015406940325739558
},
"harness|arc:challenge|25": {
"acc": 0.6092150170648464,
"acc_stderr": 0.014258563880513785,
"acc_norm": 0.6390784982935154,
"acc_norm_stderr": 0.014034761386175452
},
"harness|hellaswag|10": {
"acc": 0.632244572794264,
"acc_stderr": 0.004812088620277182,
"acc_norm": 0.8255327623979287,
"acc_norm_stderr": 0.0037873515193708137
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933713,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268528,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268528
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338642,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338642
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121437,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.02407869658063548,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.02407869658063548
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465073,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465073
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.02995382389188704,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.02995382389188704
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659806,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163248,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163248
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.034086558679777494,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.034086558679777494
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.02646056956124064,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.02646056956124064
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8270042194092827,
"acc_stderr": 0.024621562866768427,
"acc_norm": 0.8270042194092827,
"acc_norm_stderr": 0.024621562866768427
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.03050028317654585,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.03050028317654585
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728744,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728744
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709697,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709697
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.0349260647662379,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.0349260647662379
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281376,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281376
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8135376756066411,
"acc_stderr": 0.013927751372001505,
"acc_norm": 0.8135376756066411,
"acc_norm_stderr": 0.013927751372001505
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.02468531686725781,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.02468531686725781
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4,
"acc_stderr": 0.016384638410380823,
"acc_norm": 0.4,
"acc_norm_stderr": 0.016384638410380823
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.026256053835718964,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.026256053835718964
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7623456790123457,
"acc_stderr": 0.02368359183700856,
"acc_norm": 0.7623456790123457,
"acc_norm_stderr": 0.02368359183700856
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.470013037809648,
"acc_stderr": 0.012747248967079067,
"acc_norm": 0.470013037809648,
"acc_norm_stderr": 0.012747248967079067
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170595,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170595
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142773,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578327,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37209302325581395,
"mc1_stderr": 0.016921090118814035,
"mc2": 0.5384060653321814,
"mc2_stderr": 0.015406940325739558
},
"harness|winogrande|5": {
"acc": 0.7805840568271507,
"acc_stderr": 0.01163126836060778
},
"harness|gsm8k|5": {
"acc": 0.5625473843821076,
"acc_stderr": 0.013664299060751915
}
}
```
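The per-task entries above can be sliced mechanically, for instance to rank the MMLU (`hendrycksTest`) subtasks by `acc_norm`. A small sketch using a hand-copied subset of the results dict shown above:

```python
# Subset of the results above, copied verbatim.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.27},
    "harness|hendrycksTest-management|5": {"acc_norm": 0.8543689320388349},
    "harness|hendrycksTest-world_religions|5": {"acc_norm": 0.8187134502923976},
    "harness|winogrande|5": {"acc": 0.7805840568271507},  # no acc_norm reported
}

# Keep only the MMLU subtasks and rank them by normalized accuracy.
mmlu = {
    task: scores["acc_norm"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest")
}
ranked = sorted(mmlu, key=mmlu.get, reverse=True)
print(ranked[0])  # the strongest subtask in this subset
```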
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 380763
num_examples: 1431
- name: train
num_bytes: 796596
num_examples: 2976
- name: validation
num_bytes: 92634
num_examples: 342
download_size: 848350
dataset_size: 1269993
---
# Dataset Card for "MULTI_VALUE_mrpc_indefinite_for_definite_articles"
[More Information Needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
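The split metadata in the `dataset_info` block above is internally consistent and can be cross-checked, e.g.:

```python
# num_bytes and num_examples copied from the dataset_info block above.
splits = {
    "test": (380763, 1431),
    "train": (796596, 2976),
    "validation": (92634, 342),
}

total_bytes = sum(b for b, _ in splits.values())
total_examples = sum(n for _, n in splits.values())
assert total_bytes == 1269993  # matches the declared dataset_size
print(total_examples)  # 4749 examples across the three splits
```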