| datasetId | card |
|---|---|
Sharka/DocVQA_layoutLM | ---
dataset_info:
features:
- name: image
sequence:
sequence:
sequence: uint8
- name: answers
sequence: string
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: bbox
sequence:
sequence: int64
- name: start_positions
dtype: int64
- name: end_positions
dtype: int64
- name: questions
dtype: string
splits:
- name: train
num_bytes: 6674557036
num_examples: 38174
- name: validation
num_bytes: 882472789
num_examples: 5047
download_size: 2458338968
dataset_size: 7557029825
---
# Dataset Card for "DocVQA_layoutLM"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MrGonxo13/oraciones_yuxtapuestas | ---
license: cc-by-4.0
---
|
ura-hcmut/zalo_e2eqa-dpo | ---
license: mit
language:
- vi
size_categories:
- n<1K
configs:
- config_name: default
data_files:
- split: test
path: zalo_e2eqa-dpo.json
--- |
Oysiyl/google-android-toy-controlnet-canny | ---
dataset_info:
features:
- name: image
dtype: image
- name: conditioning_image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1476353.0
num_examples: 15
download_size: 1458471
dataset_size: 1476353.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_Danielbrdz__Barcenas-3b | ---
pretty_name: Evaluation run of Danielbrdz/Barcenas-3b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Danielbrdz/Barcenas-3b](https://huggingface.co/Danielbrdz/Barcenas-3b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Danielbrdz__Barcenas-3b_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-19T09:57:40.626211](https://huggingface.co/datasets/open-llm-leaderboard/details_Danielbrdz__Barcenas-3b_public/blob/main/results_2023-11-19T09-57-40.626211.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.29842324440945045,\n\
\ \"acc_stderr\": 0.03223833363169239,\n \"acc_norm\": 0.3005672000487252,\n\
\ \"acc_norm_stderr\": 0.03303096158756811,\n \"mc1\": 0.2533659730722154,\n\
\ \"mc1_stderr\": 0.01522589934082684,\n \"mc2\": 0.4155719273070087,\n\
\ \"mc2_stderr\": 0.013997732355524069,\n \"em\": 0.0014681208053691276,\n\
\ \"em_stderr\": 0.0003921042190298658,\n \"f1\": 0.04693791946308727,\n\
\ \"f1_stderr\": 0.0011945909744697145\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3916382252559727,\n \"acc_stderr\": 0.014264122124938217,\n\
\ \"acc_norm\": 0.431740614334471,\n \"acc_norm_stderr\": 0.014474591427196204\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5013941445927106,\n\
\ \"acc_stderr\": 0.004989762014739189,\n \"acc_norm\": 0.6781517625970922,\n\
\ \"acc_norm_stderr\": 0.0046623033952396175\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2814814814814815,\n\
\ \"acc_stderr\": 0.038850042458002526,\n \"acc_norm\": 0.2814814814814815,\n\
\ \"acc_norm_stderr\": 0.038850042458002526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.32894736842105265,\n \"acc_stderr\": 0.03823428969926604,\n\
\ \"acc_norm\": 0.32894736842105265,\n \"acc_norm_stderr\": 0.03823428969926604\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.33584905660377357,\n \"acc_stderr\": 0.029067220146644826,\n\
\ \"acc_norm\": 0.33584905660377357,\n \"acc_norm_stderr\": 0.029067220146644826\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2916666666666667,\n\
\ \"acc_stderr\": 0.03800968060554857,\n \"acc_norm\": 0.2916666666666667,\n\
\ \"acc_norm_stderr\": 0.03800968060554857\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.22127659574468084,\n \"acc_stderr\": 0.027136349602424063,\n\
\ \"acc_norm\": 0.22127659574468084,\n \"acc_norm_stderr\": 0.027136349602424063\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.03999423879281336,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.03999423879281336\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3724137931034483,\n \"acc_stderr\": 0.040287315329475604,\n\
\ \"acc_norm\": 0.3724137931034483,\n \"acc_norm_stderr\": 0.040287315329475604\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15873015873015872,\n\
\ \"acc_stderr\": 0.03268454013011742,\n \"acc_norm\": 0.15873015873015872,\n\
\ \"acc_norm_stderr\": 0.03268454013011742\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117317,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117317\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2806451612903226,\n\
\ \"acc_stderr\": 0.02556060472102289,\n \"acc_norm\": 0.2806451612903226,\n\
\ \"acc_norm_stderr\": 0.02556060472102289\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2561576354679803,\n \"acc_stderr\": 0.030712730070982592,\n\
\ \"acc_norm\": 0.2561576354679803,\n \"acc_norm_stderr\": 0.030712730070982592\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.296969696969697,\n \"acc_stderr\": 0.03567969772268048,\n\
\ \"acc_norm\": 0.296969696969697,\n \"acc_norm_stderr\": 0.03567969772268048\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.36363636363636365,\n \"acc_stderr\": 0.034273086529999344,\n \"\
acc_norm\": 0.36363636363636365,\n \"acc_norm_stderr\": 0.034273086529999344\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.3471502590673575,\n \"acc_stderr\": 0.034356961683613546,\n\
\ \"acc_norm\": 0.3471502590673575,\n \"acc_norm_stderr\": 0.034356961683613546\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3564102564102564,\n \"acc_stderr\": 0.024283140529467295,\n\
\ \"acc_norm\": 0.3564102564102564,\n \"acc_norm_stderr\": 0.024283140529467295\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895991,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895991\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.31092436974789917,\n \"acc_stderr\": 0.030066761582977934,\n\
\ \"acc_norm\": 0.31092436974789917,\n \"acc_norm_stderr\": 0.030066761582977934\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.26422018348623855,\n\
\ \"acc_stderr\": 0.018904164171510193,\n \"acc_norm\": 0.26422018348623855,\n\
\ \"acc_norm_stderr\": 0.018904164171510193\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.3055555555555556,\n \"acc_stderr\": 0.03141554629402543,\n\
\ \"acc_norm\": 0.3055555555555556,\n \"acc_norm_stderr\": 0.03141554629402543\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.3037974683544304,\n \"acc_stderr\": 0.029936696387138608,\n\
\ \"acc_norm\": 0.3037974683544304,\n \"acc_norm_stderr\": 0.029936696387138608\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2556053811659193,\n\
\ \"acc_stderr\": 0.029275891003969927,\n \"acc_norm\": 0.2556053811659193,\n\
\ \"acc_norm_stderr\": 0.029275891003969927\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.31297709923664124,\n \"acc_stderr\": 0.04066962905677697,\n\
\ \"acc_norm\": 0.31297709923664124,\n \"acc_norm_stderr\": 0.04066962905677697\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.35537190082644626,\n \"acc_stderr\": 0.04369236326573981,\n \"\
acc_norm\": 0.35537190082644626,\n \"acc_norm_stderr\": 0.04369236326573981\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.042032772914677614,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.042032772914677614\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.24271844660194175,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.24271844660194175,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2692307692307692,\n\
\ \"acc_stderr\": 0.029058588303748845,\n \"acc_norm\": 0.2692307692307692,\n\
\ \"acc_norm_stderr\": 0.029058588303748845\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.29757343550446996,\n\
\ \"acc_stderr\": 0.016349111912909418,\n \"acc_norm\": 0.29757343550446996,\n\
\ \"acc_norm_stderr\": 0.016349111912909418\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2832369942196532,\n \"acc_stderr\": 0.02425790170532338,\n\
\ \"acc_norm\": 0.2832369942196532,\n \"acc_norm_stderr\": 0.02425790170532338\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2446927374301676,\n\
\ \"acc_stderr\": 0.014378169884098443,\n \"acc_norm\": 0.2446927374301676,\n\
\ \"acc_norm_stderr\": 0.014378169884098443\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.026992544339297236,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.026992544339297236\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3665594855305466,\n\
\ \"acc_stderr\": 0.027368078243971625,\n \"acc_norm\": 0.3665594855305466,\n\
\ \"acc_norm_stderr\": 0.027368078243971625\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3549382716049383,\n \"acc_stderr\": 0.026624152478845853,\n\
\ \"acc_norm\": 0.3549382716049383,\n \"acc_norm_stderr\": 0.026624152478845853\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2730496453900709,\n \"acc_stderr\": 0.026577860943307857,\n \
\ \"acc_norm\": 0.2730496453900709,\n \"acc_norm_stderr\": 0.026577860943307857\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.28226857887874834,\n\
\ \"acc_stderr\": 0.011495852176241952,\n \"acc_norm\": 0.28226857887874834,\n\
\ \"acc_norm_stderr\": 0.011495852176241952\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3014705882352941,\n \"acc_stderr\": 0.027875982114273168,\n\
\ \"acc_norm\": 0.3014705882352941,\n \"acc_norm_stderr\": 0.027875982114273168\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.26143790849673204,\n \"acc_stderr\": 0.01777694715752803,\n \
\ \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.01777694715752803\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n\
\ \"acc_stderr\": 0.04013964554072775,\n \"acc_norm\": 0.22727272727272727,\n\
\ \"acc_norm_stderr\": 0.04013964554072775\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.031362502409358936,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.031362502409358936\n \
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.32338308457711445,\n\
\ \"acc_stderr\": 0.03307615947979033,\n \"acc_norm\": 0.32338308457711445,\n\
\ \"acc_norm_stderr\": 0.03307615947979033\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.27710843373493976,\n\
\ \"acc_stderr\": 0.034843315926805875,\n \"acc_norm\": 0.27710843373493976,\n\
\ \"acc_norm_stderr\": 0.034843315926805875\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.035650796707083106,\n\
\ \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.035650796707083106\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2533659730722154,\n\
\ \"mc1_stderr\": 0.01522589934082684,\n \"mc2\": 0.4155719273070087,\n\
\ \"mc2_stderr\": 0.013997732355524069\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6621941594317285,\n \"acc_stderr\": 0.013292583502910892\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.0014681208053691276,\n \
\ \"em_stderr\": 0.0003921042190298658,\n \"f1\": 0.04693791946308727,\n\
\ \"f1_stderr\": 0.0011945909744697145\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.025018953752843062,\n \"acc_stderr\": 0.00430204504656428\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Danielbrdz/Barcenas-3b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|arc:challenge|25_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|drop|3_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|gsm8k|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hellaswag|10_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-19T09-57-40.626211.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-19T09-57-40.626211.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- '**/details_harness|winogrande|5_2023-11-19T09-57-40.626211.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-19T09-57-40.626211.parquet'
- config_name: results
data_files:
- split: 2023_11_19T09_57_40.626211
path:
- results_2023-11-19T09-57-40.626211.parquet
- split: latest
path:
- results_2023-11-19T09-57-40.626211.parquet
---
# Dataset Card for Evaluation run of Danielbrdz/Barcenas-3b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Danielbrdz/Barcenas-3b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Danielbrdz/Barcenas-3b](https://huggingface.co/Danielbrdz/Barcenas-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Danielbrdz__Barcenas-3b_public",
"harness_winogrande_5",
	split="latest")
```
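The configuration names follow directly from the task keys used in the results files: the `|`, `:` and `-` separators are replaced with underscores (e.g. `harness|truthfulqa:mc|0` becomes `harness_truthfulqa_mc_0`). A small helper can derive the config name to pass to `load_dataset` — a sketch based on the naming pattern visible in this card:

```python
def task_to_config(task: str) -> str:
    """Map a harness task key (as it appears in the results JSON) to the
    corresponding dataset configuration name in this repo.

    The config names replace the '|', ':' and '-' separators of the task
    key with underscores, e.g.
    'harness|truthfulqa:mc|0' -> 'harness_truthfulqa_mc_0'.
    """
    return task.replace("|", "_").replace(":", "_").replace("-", "_")
```

For example, `task_to_config("harness|hendrycksTest-college_biology|5")` yields `harness_hendrycksTest_college_biology_5`, which matches the config listed in the YAML header above.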
## Latest results
These are the [latest results from run 2023-11-19T09:57:40.626211](https://huggingface.co/datasets/open-llm-leaderboard/details_Danielbrdz__Barcenas-3b_public/blob/main/results_2023-11-19T09-57-40.626211.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "results" configuration and in the "latest" split of its own configuration):
```python
{
"all": {
"acc": 0.29842324440945045,
"acc_stderr": 0.03223833363169239,
"acc_norm": 0.3005672000487252,
"acc_norm_stderr": 0.03303096158756811,
"mc1": 0.2533659730722154,
"mc1_stderr": 0.01522589934082684,
"mc2": 0.4155719273070087,
"mc2_stderr": 0.013997732355524069,
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298658,
"f1": 0.04693791946308727,
"f1_stderr": 0.0011945909744697145
},
"harness|arc:challenge|25": {
"acc": 0.3916382252559727,
"acc_stderr": 0.014264122124938217,
"acc_norm": 0.431740614334471,
"acc_norm_stderr": 0.014474591427196204
},
"harness|hellaswag|10": {
"acc": 0.5013941445927106,
"acc_stderr": 0.004989762014739189,
"acc_norm": 0.6781517625970922,
"acc_norm_stderr": 0.0046623033952396175
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.038850042458002526,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.038850042458002526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.32894736842105265,
"acc_stderr": 0.03823428969926604,
"acc_norm": 0.32894736842105265,
"acc_norm_stderr": 0.03823428969926604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.33584905660377357,
"acc_stderr": 0.029067220146644826,
"acc_norm": 0.33584905660377357,
"acc_norm_stderr": 0.029067220146644826
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.03800968060554857,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.03800968060554857
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.22127659574468084,
"acc_stderr": 0.027136349602424063,
"acc_norm": 0.22127659574468084,
"acc_norm_stderr": 0.027136349602424063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281336,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281336
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3724137931034483,
"acc_stderr": 0.040287315329475604,
"acc_norm": 0.3724137931034483,
"acc_norm_stderr": 0.040287315329475604
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15873015873015872,
"acc_stderr": 0.03268454013011742,
"acc_norm": 0.15873015873015872,
"acc_norm_stderr": 0.03268454013011742
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117317,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117317
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2806451612903226,
"acc_stderr": 0.02556060472102289,
"acc_norm": 0.2806451612903226,
"acc_norm_stderr": 0.02556060472102289
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2561576354679803,
"acc_stderr": 0.030712730070982592,
"acc_norm": 0.2561576354679803,
"acc_norm_stderr": 0.030712730070982592
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.296969696969697,
"acc_stderr": 0.03567969772268048,
"acc_norm": 0.296969696969697,
"acc_norm_stderr": 0.03567969772268048
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.36363636363636365,
"acc_stderr": 0.034273086529999344,
"acc_norm": 0.36363636363636365,
"acc_norm_stderr": 0.034273086529999344
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3471502590673575,
"acc_stderr": 0.034356961683613546,
"acc_norm": 0.3471502590673575,
"acc_norm_stderr": 0.034356961683613546
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3564102564102564,
"acc_stderr": 0.024283140529467295,
"acc_norm": 0.3564102564102564,
"acc_norm_stderr": 0.024283140529467295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.02646611753895991,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.02646611753895991
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.31092436974789917,
"acc_stderr": 0.030066761582977934,
"acc_norm": 0.31092436974789917,
"acc_norm_stderr": 0.030066761582977934
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.26422018348623855,
"acc_stderr": 0.018904164171510193,
"acc_norm": 0.26422018348623855,
"acc_norm_stderr": 0.018904164171510193
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.03141554629402543,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.03141554629402543
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3037974683544304,
"acc_stderr": 0.029936696387138608,
"acc_norm": 0.3037974683544304,
"acc_norm_stderr": 0.029936696387138608
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2556053811659193,
"acc_stderr": 0.029275891003969927,
"acc_norm": 0.2556053811659193,
"acc_norm_stderr": 0.029275891003969927
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.31297709923664124,
"acc_stderr": 0.04066962905677697,
"acc_norm": 0.31297709923664124,
"acc_norm_stderr": 0.04066962905677697
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.35537190082644626,
"acc_stderr": 0.04369236326573981,
"acc_norm": 0.35537190082644626,
"acc_norm_stderr": 0.04369236326573981
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.042032772914677614,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.042032772914677614
},
"harness|hendrycksTest-management|5": {
"acc": 0.24271844660194175,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.24271844660194175,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2692307692307692,
"acc_stderr": 0.029058588303748845,
"acc_norm": 0.2692307692307692,
"acc_norm_stderr": 0.029058588303748845
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.29757343550446996,
"acc_stderr": 0.016349111912909418,
"acc_norm": 0.29757343550446996,
"acc_norm_stderr": 0.016349111912909418
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2832369942196532,
"acc_stderr": 0.02425790170532338,
"acc_norm": 0.2832369942196532,
"acc_norm_stderr": 0.02425790170532338
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2446927374301676,
"acc_stderr": 0.014378169884098443,
"acc_norm": 0.2446927374301676,
"acc_norm_stderr": 0.014378169884098443
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.026992544339297236,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.026992544339297236
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3665594855305466,
"acc_stderr": 0.027368078243971625,
"acc_norm": 0.3665594855305466,
"acc_norm_stderr": 0.027368078243971625
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3549382716049383,
"acc_stderr": 0.026624152478845853,
"acc_norm": 0.3549382716049383,
"acc_norm_stderr": 0.026624152478845853
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2730496453900709,
"acc_stderr": 0.026577860943307857,
"acc_norm": 0.2730496453900709,
"acc_norm_stderr": 0.026577860943307857
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.28226857887874834,
"acc_stderr": 0.011495852176241952,
"acc_norm": 0.28226857887874834,
"acc_norm_stderr": 0.011495852176241952
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3014705882352941,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.3014705882352941,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.01777694715752803,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.01777694715752803
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072775,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072775
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4,
"acc_stderr": 0.031362502409358936,
"acc_norm": 0.4,
"acc_norm_stderr": 0.031362502409358936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.32338308457711445,
"acc_stderr": 0.03307615947979033,
"acc_norm": 0.32338308457711445,
"acc_norm_stderr": 0.03307615947979033
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-virology|5": {
"acc": 0.27710843373493976,
"acc_stderr": 0.034843315926805875,
"acc_norm": 0.27710843373493976,
"acc_norm_stderr": 0.034843315926805875
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.035650796707083106,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.035650796707083106
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2533659730722154,
"mc1_stderr": 0.01522589934082684,
"mc2": 0.4155719273070087,
"mc2_stderr": 0.013997732355524069
},
"harness|winogrande|5": {
"acc": 0.6621941594317285,
"acc_stderr": 0.013292583502910892
},
"harness|drop|3": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298658,
"f1": 0.04693791946308727,
"f1_stderr": 0.0011945909744697145
},
"harness|gsm8k|5": {
"acc": 0.025018953752843062,
"acc_stderr": 0.00430204504656428
}
}
```
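The per-task MMLU (hendrycksTest) accuracies in a results file like the one above can be averaged with a small helper. This is a sketch only; it assumes the JSON has been parsed into a Python dict with the `harness|hendrycksTest-*` task keys shown:

```python
def mmlu_average(results: dict) -> float:
    """Average 'acc' over the hendrycksTest (MMLU) subtask entries of a
    results dict shaped like the JSON above."""
    accs = [
        scores["acc"]
        for task, scores in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    return sum(accs) / len(accs)
```

Note this is an unweighted mean over subtasks, which is how per-category MMLU scores are commonly summarised; the leaderboard's own aggregation is taken from the "all" entry in the results file.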
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ksabeh/openbrand-zs | ---
dataset_info:
features:
- name: category
dtype: string
- name: title
dtype: string
- name: brand
dtype: string
- name: asin
dtype: string
- name: imageURL
dtype: string
- name: position_index
dtype: int64
- name: num_tokens
dtype: int64
- name: title_length
dtype: int64
- name: title_category
dtype: string
splits:
- name: train
num_bytes: 24211621
num_examples: 61075
- name: val
num_bytes: 2685833
num_examples: 6788
- name: test
num_bytes: 9453851
num_examples: 25221
- name: electronics
num_bytes: 2423259
num_examples: 4786
- name: sports
num_bytes: 1904597
num_examples: 5420
- name: toys
num_bytes: 2078207
num_examples: 6329
- name: automotive
num_bytes: 2271017
num_examples: 6446
- name: grocery
num_bytes: 776771
num_examples: 2240
download_size: 13092616
dataset_size: 45805156
---
# Dataset Card for "openbrand-zs"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zolak/twitter_dataset_1712995629 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2714731
num_examples: 6670
download_size: 1347496
dataset_size: 2714731
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jordanfan/processed_us_congress_117_bills_v2 | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: index
dtype: int64
- name: id
dtype: string
- name: policy_areas
dtype: string
- name: cur_summary
dtype: string
- name: cur_text
dtype: string
- name: title
dtype: string
- name: titles_official
dtype: string
- name: titles_short
dtype: string
- name: sponsor_name
dtype: string
- name: sponsor_party
dtype: string
- name: sponsor_state
dtype: string
- name: cleaned_summary
dtype: string
- name: extracted_text
dtype: string
- name: extracted_text_375
dtype: string
- name: extracted_text_750
dtype: string
- name: extracted_text_1000
dtype: string
splits:
- name: train
num_bytes: 371187381
num_examples: 11277
- name: val
num_bytes: 107701147
num_examples: 3388
- name: test
num_bytes: 17678977
num_examples: 377
download_size: 204648744
dataset_size: 496567505
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
---
|
CyberHarem/necron_misha_maougakuinnofutekigousha | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Necron Misha/ミーシャ・ネクロン (Maou Gakuin no Futekigousha)
This is the dataset of Necron Misha/ミーシャ・ネクロン (Maou Gakuin no Futekigousha), containing 523 images and their tags.
The core tags of this character are `hair_ornament, sidelocks, grey_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 523 | 343.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/necron_misha_maougakuinnofutekigousha/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 523 | 343.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/necron_misha_maougakuinnofutekigousha/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 987 | 592.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/necron_misha_maougakuinnofutekigousha/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/necron_misha_maougakuinnofutekigousha',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
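For the IMG+TXT packages listed above, each image ships with a sibling `.txt` file holding its tag list. A minimal sketch for surveying which tags dominate an extracted package; the `tag_frequencies` helper is hypothetical, and it assumes the tag files are comma-separated:

```python
from collections import Counter
from pathlib import Path


def tag_frequencies(dataset_dir: str) -> Counter:
    """Count how often each tag appears across the per-image .txt files.

    Assumes the IMG+TXT layout: every image has a sibling .txt file
    containing a comma-separated tag list.
    """
    counts = Counter()
    for txt in Path(dataset_dir).rglob("*.txt"):
        tags = (t.strip() for t in txt.read_text(encoding="utf-8").split(","))
        counts.update(t for t in tags if t)
    return counts


if __name__ == "__main__":
    # Point this at the directory you extracted a package into.
    for tag, n in tag_frequencies("dataset_dir").most_common(20):
        print(f"{n:5d}  {tag}")
```

This only reads the sidecar text files, so it works on any of the packages without loading the images themselves.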
## List of Clusters
List of tag clustering results; some recurring outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, long_sleeves, short_hair_with_long_locks, solo, standing, white_thighhighs, capelet, from_side, zettai_ryouiki, profile, white_hair, tree, white_dress |
| 1 | 7 |  |  |  |  |  | 1girl, black_necktie, short_hair_with_long_locks, solo, white_dress, purple_hair, long_sleeves, looking_at_viewer, medium_breasts, expressionless, grey_eyes, upper_body |
| 2 | 8 |  |  |  |  |  | 1girl, black_necktie, short_hair_with_long_locks, solo, upper_body, blue_eyes, hair_between_eyes, white_shirt, closed_mouth, medium_breasts, long_hair |
| 3 | 6 |  |  |  |  |  | 1girl, black_necktie, short_hair_with_long_locks, solo, upper_body, blue_eyes, medium_breasts, sitting, closed_mouth |
| 4 | 5 |  |  |  |  |  | 1girl, closed_mouth, expressionless, portrait, short_hair_with_long_locks, solo, looking_at_viewer |
| 5 | 6 |  |  |  |  |  | 1girl, cloud, day, portrait, solo, blue_eyes, blue_sky, hair_between_eyes, white_hair, closed_mouth, green_eyes, short_hair_with_long_locks, school_uniform |
| 6 | 10 |  |  |  |  |  | 1girl, collared_shirt, short_hair_with_long_locks, solo, white_shirt, closed_mouth, jacket, upper_body, neck_ribbon, school_uniform, green_eyes, looking_at_viewer, gem, expressionless |
| 7 | 5 |  |  |  |  |  | 1girl, jacket, solo, upper_body, closed_eyes, closed_mouth, profile, short_hair_with_long_locks, smile, from_behind, holding, purple_hair |
| 8 | 6 |  |  |  |  |  | 1girl, blush, glowing, green_eyes, solo, ahoge, closed_mouth, long_hair, looking_at_viewer, bodysuit |
| 9 | 6 |  |  |  |  |  | 1girl, collared_shirt, outdoors, short_hair_with_long_locks, white_shirt, blue_eyes, jacket, long_sleeves, park_bench, solo, tree, white_hair, ring, school_uniform, sitting |
| 10 | 6 |  |  |  |  |  | 2girls, from_side, profile, closed_mouth, green_eyes, portrait, solo_focus, blonde_hair, short_hair_with_long_locks, white_hair |
| 11 | 6 |  |  |  |  |  | 1girl, blue_eyes, from_behind, long_sleeves, looking_at_viewer, looking_back, sleeves_past_fingers, solo, white_hair, short_hair_with_long_locks, closed_mouth, white_dress |
| 12 | 6 |  |  |  |  |  | 2girls, close-up, profile, blonde_hair, long_hair, yuri, from_side, looking_at_another, white_hair, closed_mouth, open_mouth |
| 13 | 5 |  |  |  |  |  | 2girls, fingerless_gloves, green_eyes, long_hair, brown_hair, long_sleeves, short_hair, 1boy, black_gloves, black_hair, brick_wall, tears, yuri |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | long_sleeves | short_hair_with_long_locks | solo | standing | white_thighhighs | capelet | from_side | zettai_ryouiki | profile | white_hair | tree | white_dress | black_necktie | purple_hair | looking_at_viewer | medium_breasts | expressionless | grey_eyes | upper_body | blue_eyes | hair_between_eyes | white_shirt | closed_mouth | long_hair | sitting | portrait | cloud | day | blue_sky | green_eyes | school_uniform | collared_shirt | jacket | neck_ribbon | gem | closed_eyes | smile | from_behind | holding | blush | glowing | ahoge | bodysuit | outdoors | park_bench | ring | 2girls | solo_focus | blonde_hair | looking_back | sleeves_past_fingers | close-up | yuri | looking_at_another | open_mouth | fingerless_gloves | brown_hair | short_hair | 1boy | black_gloves | black_hair | brick_wall | tears |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:---------------|:-----------------------------|:-------|:-----------|:-------------------|:----------|:------------|:-----------------|:----------|:-------------|:-------|:--------------|:----------------|:--------------|:--------------------|:-----------------|:-----------------|:------------|:-------------|:------------|:--------------------|:--------------|:---------------|:------------|:----------|:-----------|:--------|:------|:-----------|:-------------|:-----------------|:-----------------|:---------|:--------------|:------|:--------------|:--------|:--------------|:----------|:--------|:----------|:--------|:-----------|:-----------|:-------------|:-------|:---------|:-------------|:--------------|:---------------|:-----------------------|:-----------|:-------|:---------------------|:-------------|:--------------------|:-------------|:-------------|:-------|:---------------|:-------------|:-------------|:--------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 8 |  |  |  |  |  | X | | X | X | | | | | | | | | | X | | | X | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | | X | X | | | | | | | | | | X | | | X | | | X | X | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | X | X | | | | | | | | | | | | X | | X | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | | X | X | | | | | | | X | | | | | | | | | | X | X | | X | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 10 |  |  |  |  |  | X | | X | X | | | | | | | | | | | | X | | X | | X | | | X | X | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | | X | X | | | | | | X | | | | | X | | | | | X | | | | X | | | | | | | | | | X | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | X | | | X | | | | | | | | | | | | X | | | | | | | | X | X | | | | | | X | | | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | |
| 9 | 6 |  |  |  |  |  | X | X | X | X | | | | | | | X | X | | | | | | | | | X | | X | | | X | | | | | | X | X | X | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | |
| 10 | 6 |  |  |  |  |  | | | X | | | | | X | | X | X | | | | | | | | | | | | | X | | | X | | | | X | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | |
| 11 | 6 |  |  |  |  |  | X | X | X | X | | | | | | | X | | X | | | X | | | | | X | | | X | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | X | | | | | | | | | | | | |
| 12 | 6 |  |  |  |  |  | | | | | | | | X | | X | X | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | X | | X | | | X | X | X | X | | | | | | | | |
| 13 | 5 |  |  |  |  |  | | X | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | X | | | | | | X | | | X | X | X | X | X | X | X | X |
|
themanas021/MATH-Algebra | ---
license: mit
---
|
NeelNanda/pile-10k | ---
license: bigscience-bloom-rail-1.0
---
The first 10K elements of [The Pile](https://pile.eleuther.ai/), useful for debugging models trained on it. See the [HuggingFace page for the full Pile](https://huggingface.co/datasets/the_pile) for more info. Inspired by [stas' great resource](https://huggingface.co/datasets/stas/openwebtext-10k) doing the same for OpenWebText |
open-llm-leaderboard/details_runkai__PascalHermes-2.5-Mistral-7B | ---
pretty_name: Evaluation run of runkai/PascalHermes-2.5-Mistral-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [runkai/PascalHermes-2.5-Mistral-7B](https://huggingface.co/runkai/PascalHermes-2.5-Mistral-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_runkai__PascalHermes-2.5-Mistral-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-01T00:53:29.582984](https://huggingface.co/datasets/open-llm-leaderboard/details_runkai__PascalHermes-2.5-Mistral-7B/blob/main/results_2024-03-01T00-53-29.582984.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6224846504972104,\n\
\ \"acc_stderr\": 0.03249363469713295,\n \"acc_norm\": 0.6261443120868132,\n\
\ \"acc_norm_stderr\": 0.03313930679039025,\n \"mc1\": 0.3623011015911873,\n\
\ \"mc1_stderr\": 0.016826646897262255,\n \"mc2\": 0.5372446307638827,\n\
\ \"mc2_stderr\": 0.015160471880409516\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6083617747440273,\n \"acc_stderr\": 0.014264122124938215,\n\
\ \"acc_norm\": 0.6382252559726962,\n \"acc_norm_stderr\": 0.014041957945038083\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6430989842660825,\n\
\ \"acc_stderr\": 0.004781061390873913,\n \"acc_norm\": 0.8374825731925911,\n\
\ \"acc_norm_stderr\": 0.0036817082825814566\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395269,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395269\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.02872750295788027,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.02872750295788027\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n\
\ \"acc_stderr\": 0.03758517775404947,\n \"acc_norm\": 0.5838150289017341,\n\
\ \"acc_norm_stderr\": 0.03758517775404947\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n\
\ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7709677419354839,\n \"acc_stderr\": 0.023904914311782648,\n \"\
acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.023904914311782648\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.43842364532019706,\n \"acc_stderr\": 0.03491207857486518,\n \"\
acc_norm\": 0.43842364532019706,\n \"acc_norm_stderr\": 0.03491207857486518\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229862,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229862\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709443,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.024639789097709443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5948717948717949,\n \"acc_stderr\": 0.024890471769938145,\n\
\ \"acc_norm\": 0.5948717948717949,\n \"acc_norm_stderr\": 0.024890471769938145\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.0279404571362284,\n \"acc_norm\":\
\ 0.3,\n \"acc_norm_stderr\": 0.0279404571362284\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059288,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059288\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8256880733944955,\n \"acc_stderr\": 0.016265675632010323,\n \"\
acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.016265675632010323\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.034076320938540516,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.034076320938540516\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.02812597226565437,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.02812597226565437\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n\
\ \"acc_stderr\": 0.014072859310451947,\n \"acc_norm\": 0.8084291187739464,\n\
\ \"acc_norm_stderr\": 0.014072859310451947\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.329608938547486,\n\
\ \"acc_stderr\": 0.015721531075183884,\n \"acc_norm\": 0.329608938547486,\n\
\ \"acc_norm_stderr\": 0.015721531075183884\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n\
\ \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.6881028938906752,\n\
\ \"acc_norm_stderr\": 0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765134,\n\
\ \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765134\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.455019556714472,\n\
\ \"acc_stderr\": 0.012718456618701763,\n \"acc_norm\": 0.455019556714472,\n\
\ \"acc_norm_stderr\": 0.012718456618701763\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6544117647058824,\n \"acc_stderr\": 0.028888193103988626,\n\
\ \"acc_norm\": 0.6544117647058824,\n \"acc_norm_stderr\": 0.028888193103988626\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6437908496732027,\n \"acc_stderr\": 0.019373332420724504,\n \
\ \"acc_norm\": 0.6437908496732027,\n \"acc_norm_stderr\": 0.019373332420724504\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8009950248756219,\n\
\ \"acc_stderr\": 0.028231365092758406,\n \"acc_norm\": 0.8009950248756219,\n\
\ \"acc_norm_stderr\": 0.028231365092758406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3623011015911873,\n\
\ \"mc1_stderr\": 0.016826646897262255,\n \"mc2\": 0.5372446307638827,\n\
\ \"mc2_stderr\": 0.015160471880409516\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.771112865035517,\n \"acc_stderr\": 0.011807360224025397\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.48218347232752085,\n \
\ \"acc_stderr\": 0.013763738379867921\n }\n}\n```"
repo_url: https://huggingface.co/runkai/PascalHermes-2.5-Mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|arc:challenge|25_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|gsm8k|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hellaswag|10_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T00-53-29.582984.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T00-53-29.582984.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- '**/details_harness|winogrande|5_2024-03-01T00-53-29.582984.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-01T00-53-29.582984.parquet'
- config_name: results
data_files:
- split: 2024_03_01T00_53_29.582984
path:
- results_2024-03-01T00-53-29.582984.parquet
- split: latest
path:
- results_2024-03-01T00-53-29.582984.parquet
---
# Dataset Card for Evaluation run of Danielbrdz/Barcenas-3b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Danielbrdz/Barcenas-3b](https://huggingface.co/Danielbrdz/Barcenas-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Danielbrdz__Barcenas-3b",
	"harness_winogrande_5",
	split="latest")
```
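The per-run split names listed in the configuration section above follow a simple convention: the run timestamp with `:` and `-` replaced by `_`. A minimal sketch of that rule (inferred from the split names in this card, not from the leaderboard tooling itself):

```python
# Derive a per-run split name from a run timestamp, matching the split
# names listed in this card. The replacement rule is inferred from the
# "2024_03_01T00_53_29.582984" splits above.
timestamp = "2024-03-01T00:53:29.582984"
split_name = timestamp.replace(":", "_").replace("-", "_")
print(split_name)  # 2024_03_01T00_53_29.582984
```

Passing `split="latest"` instead always selects the most recent run without computing a timestamped split name.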
## Latest results
These are the [latest results from run 2024-03-01T00:53:29.582984](https://huggingface.co/datasets/open-llm-leaderboard/details_Danielbrdz__Barcenas-3b/blob/main/results_2024-03-01T00-53-29.582984.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6224846504972104,
"acc_stderr": 0.03249363469713295,
"acc_norm": 0.6261443120868132,
"acc_norm_stderr": 0.03313930679039025,
"mc1": 0.3623011015911873,
"mc1_stderr": 0.016826646897262255,
"mc2": 0.5372446307638827,
"mc2_stderr": 0.015160471880409516
},
"harness|arc:challenge|25": {
"acc": 0.6083617747440273,
"acc_stderr": 0.014264122124938215,
"acc_norm": 0.6382252559726962,
"acc_norm_stderr": 0.014041957945038083
},
"harness|hellaswag|10": {
"acc": 0.6430989842660825,
"acc_stderr": 0.004781061390873913,
"acc_norm": 0.8374825731925911,
"acc_norm_stderr": 0.0036817082825814566
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.042763494943765995,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.042763494943765995
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395269,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395269
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.02872750295788027,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.02872750295788027
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.03758517775404947,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.03758517775404947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782648,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782648
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43842364532019706,
"acc_stderr": 0.03491207857486518,
"acc_norm": 0.43842364532019706,
"acc_norm_stderr": 0.03491207857486518
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229862,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229862
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.024639789097709443,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.024639789097709443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5948717948717949,
"acc_stderr": 0.024890471769938145,
"acc_norm": 0.5948717948717949,
"acc_norm_stderr": 0.024890471769938145
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.0279404571362284,
"acc_norm": 0.3,
"acc_norm_stderr": 0.0279404571362284
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059288,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059288
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.016265675632010323,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.016265675632010323
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.02812597226565437,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.02812597226565437
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451947,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451947
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.329608938547486,
"acc_stderr": 0.015721531075183884,
"acc_norm": 0.329608938547486,
"acc_norm_stderr": 0.015721531075183884
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.02631185807185416,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.02631185807185416
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.025089478523765134,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.025089478523765134
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.455019556714472,
"acc_stderr": 0.012718456618701763,
"acc_norm": 0.455019556714472,
"acc_norm_stderr": 0.012718456618701763
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6544117647058824,
"acc_stderr": 0.028888193103988626,
"acc_norm": 0.6544117647058824,
"acc_norm_stderr": 0.028888193103988626
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6437908496732027,
"acc_stderr": 0.019373332420724504,
"acc_norm": 0.6437908496732027,
"acc_norm_stderr": 0.019373332420724504
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8009950248756219,
"acc_stderr": 0.028231365092758406,
"acc_norm": 0.8009950248756219,
"acc_norm_stderr": 0.028231365092758406
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3623011015911873,
"mc1_stderr": 0.016826646897262255,
"mc2": 0.5372446307638827,
"mc2_stderr": 0.015160471880409516
},
"harness|winogrande|5": {
"acc": 0.771112865035517,
"acc_stderr": 0.011807360224025397
},
"harness|gsm8k|5": {
"acc": 0.48218347232752085,
"acc_stderr": 0.013763738379867921
}
}
```
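The aggregated results above are plain JSON keyed by `harness|<task>|<n_shot>` names. As a minimal sketch of how one might post-process them, the snippet below ranks a small, hand-copied subset of the scores shown above by `acc_norm` (the subset and variable names are illustrative, not part of the evaluation output):

```python
# Illustrative subset of the aggregated results JSON above.
results = {
    "harness|hendrycksTest-world_religions|5": {"acc_norm": 0.8362573099415205},
    "harness|hendrycksTest-marketing|5": {"acc_norm": 0.8547008547008547},
    "harness|hendrycksTest-moral_scenarios|5": {"acc_norm": 0.329608938547486},
}

# Sort task names from strongest to weakest acc_norm.
top = sorted(results.items(), key=lambda kv: kv[1]["acc_norm"], reverse=True)

for name, scores in top:
    print(f"{name}: {scores['acc_norm']:.4f}")
```

The same pattern applies to the full JSON blob once loaded with `json.loads`.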
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Jasshl/bathroom | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 8789374.0
num_examples: 149
download_size: 7893953
dataset_size: 8789374.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_NeverSleep__Mistral-11B-SynthIAirOmniMix | ---
pretty_name: Evaluation run of NeverSleep/Mistral-11B-SynthIAirOmniMix
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NeverSleep/Mistral-11B-SynthIAirOmniMix](https://huggingface.co/NeverSleep/Mistral-11B-SynthIAirOmniMix)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NeverSleep__Mistral-11B-SynthIAirOmniMix_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-12T19:54:58.939194](https://huggingface.co/datasets/open-llm-leaderboard/details_NeverSleep__Mistral-11B-SynthIAirOmniMix_public/blob/main/results_2023-11-12T19-54-58.939194.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6277127436205546,\n\
\ \"acc_stderr\": 0.03243061765974366,\n \"acc_norm\": 0.6378229900253635,\n\
\ \"acc_norm_stderr\": 0.03315507636067878,\n \"mc1\": 0.3880048959608323,\n\
\ \"mc1_stderr\": 0.017058761501347972,\n \"mc2\": 0.5568818997417452,\n\
\ \"mc2_stderr\": 0.015517245006607807,\n \"em\": 0.23259228187919462,\n\
\ \"em_stderr\": 0.004326636227794088,\n \"f1\": 0.28881291946308657,\n\
\ \"f1_stderr\": 0.004306419385994737\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5921501706484642,\n \"acc_stderr\": 0.014361097288449705,\n\
\ \"acc_norm\": 0.6245733788395904,\n \"acc_norm_stderr\": 0.014150631435111728\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6396136227843059,\n\
\ \"acc_stderr\": 0.004791313101877047,\n \"acc_norm\": 0.8313085042820155,\n\
\ \"acc_norm_stderr\": 0.003737138752336941\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411022,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411022\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.028815615713432115,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.028815615713432115\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.03745554791462456,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.03745554791462456\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237101,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237101\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.025107425481137282,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.025107425481137282\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n\
\ \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n\
\ \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.02805779167298901,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.02805779167298901\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948492,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.03017680828897434,\n \
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.03017680828897434\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8220183486238533,\n\
\ \"acc_stderr\": 0.01639943636661292,\n \"acc_norm\": 0.8220183486238533,\n\
\ \"acc_norm_stderr\": 0.01639943636661292\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n\
\ \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078966,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078966\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n\
\ \"acc_stderr\": 0.03050028317654585,\n \"acc_norm\": 0.7085201793721974,\n\
\ \"acc_norm_stderr\": 0.03050028317654585\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077816,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077816\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8160919540229885,\n\
\ \"acc_stderr\": 0.01385372417092253,\n \"acc_norm\": 0.8160919540229885,\n\
\ \"acc_norm_stderr\": 0.01385372417092253\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.025070713719153186,\n\
\ \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.025070713719153186\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37318435754189944,\n\
\ \"acc_stderr\": 0.016175692013381968,\n \"acc_norm\": 0.37318435754189944,\n\
\ \"acc_norm_stderr\": 0.016175692013381968\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.0256468630971379,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.0256468630971379\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n\
\ \"acc_stderr\": 0.026385273703464482,\n \"acc_norm\": 0.684887459807074,\n\
\ \"acc_norm_stderr\": 0.026385273703464482\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n\
\ \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45436766623207303,\n\
\ \"acc_stderr\": 0.012716941720734804,\n \"acc_norm\": 0.45436766623207303,\n\
\ \"acc_norm_stderr\": 0.012716941720734804\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462927,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462927\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6503267973856209,\n \"acc_stderr\": 0.01929196189506638,\n \
\ \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.01929196189506638\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578334,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727668,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727668\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3880048959608323,\n\
\ \"mc1_stderr\": 0.017058761501347972,\n \"mc2\": 0.5568818997417452,\n\
\ \"mc2_stderr\": 0.015517245006607807\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7640094711917916,\n \"acc_stderr\": 0.011933828850275626\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.23259228187919462,\n \
\ \"em_stderr\": 0.004326636227794088,\n \"f1\": 0.28881291946308657,\n\
\ \"f1_stderr\": 0.004306419385994737\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.11902956785443518,\n \"acc_stderr\": 0.00891970291116164\n\
\ }\n}\n```"
repo_url: https://huggingface.co/NeverSleep/Mistral-11B-SynthIAirOmniMix
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|arc:challenge|25_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|drop|3_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|gsm8k|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hellaswag|10_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-12T19-54-58.939194.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-12T19-54-58.939194.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- '**/details_harness|winogrande|5_2023-11-12T19-54-58.939194.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-12T19-54-58.939194.parquet'
- config_name: results
data_files:
- split: 2023_11_12T19_54_58.939194
path:
- results_2023-11-12T19-54-58.939194.parquet
- split: latest
path:
- results_2023-11-12T19-54-58.939194.parquet
---
# Dataset Card for Evaluation run of NeverSleep/Mistral-11B-SynthIAirOmniMix
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NeverSleep/Mistral-11B-SynthIAirOmniMix
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [NeverSleep/Mistral-11B-SynthIAirOmniMix](https://huggingface.co/NeverSleep/Mistral-11B-SynthIAirOmniMix) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NeverSleep__Mistral-11B-SynthIAirOmniMix_public",
"harness_winogrande_5",
	split="latest")
```
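Since the timestamped split names encode the run time, a small helper can sort them to find the most recent run without relying on the "latest" alias. This is a minimal sketch (the `latest_split` helper is not part of the card's tooling), assuming split names follow the `2023_11_12T19_54_58.939194` pattern shown above:

```python
from datetime import datetime

def latest_split(split_names):
    """Return the most recent timestamped split name.

    Split names are assumed to follow the pattern
    '2023_11_12T19_54_58.939194' (the run timestamp);
    the special 'latest' alias is skipped.
    """
    timestamped = [s for s in split_names if s != "latest"]
    return max(
        timestamped,
        key=lambda s: datetime.strptime(s, "%Y_%m_%dT%H_%M_%S.%f"),
    )

# Example: the later run wins.
print(latest_split(["2023_10_01T08_00_00.000000",
                    "2023_11_12T19_54_58.939194"]))
```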
## Latest results
These are the [latest results from run 2023-11-12T19:54:58.939194](https://huggingface.co/datasets/open-llm-leaderboard/details_NeverSleep__Mistral-11B-SynthIAirOmniMix_public/blob/main/results_2023-11-12T19-54-58.939194.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6277127436205546,
"acc_stderr": 0.03243061765974366,
"acc_norm": 0.6378229900253635,
"acc_norm_stderr": 0.03315507636067878,
"mc1": 0.3880048959608323,
"mc1_stderr": 0.017058761501347972,
"mc2": 0.5568818997417452,
"mc2_stderr": 0.015517245006607807,
"em": 0.23259228187919462,
"em_stderr": 0.004326636227794088,
"f1": 0.28881291946308657,
"f1_stderr": 0.004306419385994737
},
"harness|arc:challenge|25": {
"acc": 0.5921501706484642,
"acc_stderr": 0.014361097288449705,
"acc_norm": 0.6245733788395904,
"acc_norm_stderr": 0.014150631435111728
},
"harness|hellaswag|10": {
"acc": 0.6396136227843059,
"acc_stderr": 0.004791313101877047,
"acc_norm": 0.8313085042820155,
"acc_norm_stderr": 0.003737138752336941
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411022,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411022
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.028815615713432115,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.028815615713432115
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03745554791462456,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03745554791462456
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105653,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105653
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.025107425481137282,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.025107425481137282
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.02805779167298901,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.02805779167298901
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948492,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.03017680828897434,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.03017680828897434
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.01639943636661292,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.01639943636661292
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078966,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078966
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.03050028317654585,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.03050028317654585
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077816,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077816
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8160919540229885,
"acc_stderr": 0.01385372417092253,
"acc_norm": 0.8160919540229885,
"acc_norm_stderr": 0.01385372417092253
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.025070713719153186,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.025070713719153186
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37318435754189944,
"acc_stderr": 0.016175692013381968,
"acc_norm": 0.37318435754189944,
"acc_norm_stderr": 0.016175692013381968
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.0256468630971379,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.0256468630971379
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464482,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464482
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294677,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294677
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45436766623207303,
"acc_stderr": 0.012716941720734804,
"acc_norm": 0.45436766623207303,
"acc_norm_stderr": 0.012716941720734804
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462927,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462927
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6503267973856209,
"acc_stderr": 0.01929196189506638,
"acc_norm": 0.6503267973856209,
"acc_norm_stderr": 0.01929196189506638
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578334,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.038695433234721015,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.038695433234721015
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727668,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727668
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3880048959608323,
"mc1_stderr": 0.017058761501347972,
"mc2": 0.5568818997417452,
"mc2_stderr": 0.015517245006607807
},
"harness|winogrande|5": {
"acc": 0.7640094711917916,
"acc_stderr": 0.011933828850275626
},
"harness|drop|3": {
"em": 0.23259228187919462,
"em_stderr": 0.004326636227794088,
"f1": 0.28881291946308657,
"f1_stderr": 0.004306419385994737
},
"harness|gsm8k|5": {
"acc": 0.11902956785443518,
"acc_stderr": 0.00891970291116164
}
}
```
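
The per-task `acc_stderr` values above are the sample standard error of the mean over each subset's binary correctness scores. A minimal sketch of the computation, assuming the MMLU world_religions subset has 171 test questions (so the accuracy reported above corresponds to 141 correct answers):

```python
import math

def acc_stderr(correct: int, total: int) -> float:
    """Sample standard error of the mean for a 0/1 accuracy metric:
    sqrt(p * (1 - p) / (n - 1)), as computed by lm-evaluation-harness."""
    p = correct / total
    return math.sqrt(p * (1 - p) / (total - 1))

# harness|hendrycksTest-world_religions|5: 141/171 correct -> acc ~= 0.82456
print(acc_stderr(141, 171))  # ~= 0.0291708855, matching the acc_stderr above
```

The `(n - 1)` denominator comes from using the Bessel-corrected sample standard deviation before dividing by `sqrt(n)`.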
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
dllllb/stihiru | ---
license: apache-2.0
task_categories:
- text2text-generation
tags:
- art
language:
- ru
--- |
loubnabnl/math_college | ---
dataset_info:
features:
- name: prompt_college
dtype: string
- name: token_length
dtype: int64
- name: completion
dtype: string
splits:
- name: train
num_bytes: 25108775
num_examples: 5000
download_size: 12716387
dataset_size: 25108775
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jlbaker361/league-maybe-openjourney-50 | ---
dataset_info:
features:
- name: image
dtype: image
- name: prompt
dtype: string
- name: seed
dtype: int64
- name: steps
dtype: int64
splits:
- name: train
num_bytes: 28991832.0
num_examples: 72
download_size: 28990936
dataset_size: 28991832.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_cloudyu__Truthful_DPO_TomGrc_FusionNet_34Bx2_MoE | ---
pretty_name: Evaluation run of cloudyu/Truthful_DPO_TomGrc_FusionNet_34Bx2_MoE
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cloudyu/Truthful_DPO_TomGrc_FusionNet_34Bx2_MoE](https://huggingface.co/cloudyu/Truthful_DPO_TomGrc_FusionNet_34Bx2_MoE)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cloudyu__Truthful_DPO_TomGrc_FusionNet_34Bx2_MoE\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-25T20:13:45.789253](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__Truthful_DPO_TomGrc_FusionNet_34Bx2_MoE/blob/main/results_2024-01-25T20-13-45.789253.json)\
\ (note that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7669297681429887,\n\
\ \"acc_stderr\": 0.028190436925044526,\n \"acc_norm\": 0.7705423152798676,\n\
\ \"acc_norm_stderr\": 0.02872789012012348,\n \"mc1\": 0.5777233782129743,\n\
\ \"mc1_stderr\": 0.017290733254248177,\n \"mc2\": 0.7328348537061722,\n\
\ \"mc2_stderr\": 0.01412262997996187\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7022184300341296,\n \"acc_stderr\": 0.013363080107244485,\n\
\ \"acc_norm\": 0.7286689419795221,\n \"acc_norm_stderr\": 0.012993807727545789\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6715793666600279,\n\
\ \"acc_stderr\": 0.0046867890424453695,\n \"acc_norm\": 0.865166301533559,\n\
\ \"acc_norm_stderr\": 0.003408478333768256\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7481481481481481,\n\
\ \"acc_stderr\": 0.03749850709174021,\n \"acc_norm\": 0.7481481481481481,\n\
\ \"acc_norm_stderr\": 0.03749850709174021\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.881578947368421,\n \"acc_stderr\": 0.026293995855474938,\n\
\ \"acc_norm\": 0.881578947368421,\n \"acc_norm_stderr\": 0.026293995855474938\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n\
\ \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.76,\n \
\ \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8075471698113208,\n \"acc_stderr\": 0.024262979839372274,\n\
\ \"acc_norm\": 0.8075471698113208,\n \"acc_norm_stderr\": 0.024262979839372274\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8611111111111112,\n\
\ \"acc_stderr\": 0.0289198029561349,\n \"acc_norm\": 0.8611111111111112,\n\
\ \"acc_norm_stderr\": 0.0289198029561349\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.59,\n\
\ \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7514450867052023,\n\
\ \"acc_stderr\": 0.03295304696818317,\n \"acc_norm\": 0.7514450867052023,\n\
\ \"acc_norm_stderr\": 0.03295304696818317\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7914893617021277,\n \"acc_stderr\": 0.026556982117838725,\n\
\ \"acc_norm\": 0.7914893617021277,\n \"acc_norm_stderr\": 0.026556982117838725\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6052631578947368,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.6052631578947368,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7448275862068966,\n \"acc_stderr\": 0.03632984052707842,\n\
\ \"acc_norm\": 0.7448275862068966,\n \"acc_norm_stderr\": 0.03632984052707842\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.7354497354497355,\n \"acc_stderr\": 0.02271746789770862,\n \"\
acc_norm\": 0.7354497354497355,\n \"acc_norm_stderr\": 0.02271746789770862\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.896774193548387,\n \"acc_stderr\": 0.01730838128103451,\n \"acc_norm\"\
: 0.896774193548387,\n \"acc_norm_stderr\": 0.01730838128103451\n },\n\
\ \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6403940886699507,\n\
\ \"acc_stderr\": 0.03376458246509567,\n \"acc_norm\": 0.6403940886699507,\n\
\ \"acc_norm_stderr\": 0.03376458246509567\n },\n \"harness|hendrycksTest-high_school_computer_science|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"\
acc\": 0.8666666666666667,\n \"acc_stderr\": 0.026544435312706463,\n \
\ \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.026544435312706463\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9242424242424242,\n \"acc_stderr\": 0.018852670234993093,\n \"\
acc_norm\": 0.9242424242424242,\n \"acc_norm_stderr\": 0.018852670234993093\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9740932642487047,\n \"acc_stderr\": 0.011464523356953162,\n\
\ \"acc_norm\": 0.9740932642487047,\n \"acc_norm_stderr\": 0.011464523356953162\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8205128205128205,\n \"acc_stderr\": 0.019457390787681803,\n\
\ \"acc_norm\": 0.8205128205128205,\n \"acc_norm_stderr\": 0.019457390787681803\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.4666666666666667,\n \"acc_stderr\": 0.030417716961717477,\n \
\ \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.030417716961717477\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8487394957983193,\n \"acc_stderr\": 0.023274255898707946,\n\
\ \"acc_norm\": 0.8487394957983193,\n \"acc_norm_stderr\": 0.023274255898707946\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4966887417218543,\n \"acc_stderr\": 0.04082393379449654,\n \"\
acc_norm\": 0.4966887417218543,\n \"acc_norm_stderr\": 0.04082393379449654\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9155963302752294,\n \"acc_stderr\": 0.011918819327334886,\n \"\
acc_norm\": 0.9155963302752294,\n \"acc_norm_stderr\": 0.011918819327334886\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.03214952147802749,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03214952147802749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9117647058823529,\n \"acc_stderr\": 0.019907399791316945,\n \"\
acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.019907399791316945\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8987341772151899,\n \"acc_stderr\": 0.019637720526065522,\n \
\ \"acc_norm\": 0.8987341772151899,\n \"acc_norm_stderr\": 0.019637720526065522\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7892376681614349,\n\
\ \"acc_stderr\": 0.02737309550054019,\n \"acc_norm\": 0.7892376681614349,\n\
\ \"acc_norm_stderr\": 0.02737309550054019\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.028718776889342323,\n\
\ \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.028718776889342323\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540637,\n \"\
acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540637\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.03038159675665167,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.03038159675665167\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.901840490797546,\n \"acc_stderr\": 0.0233761802310596,\n\
\ \"acc_norm\": 0.901840490797546,\n \"acc_norm_stderr\": 0.0233761802310596\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6071428571428571,\n\
\ \"acc_stderr\": 0.046355501356099754,\n \"acc_norm\": 0.6071428571428571,\n\
\ \"acc_norm_stderr\": 0.046355501356099754\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8932038834951457,\n \"acc_stderr\": 0.030581088928331366,\n\
\ \"acc_norm\": 0.8932038834951457,\n \"acc_norm_stderr\": 0.030581088928331366\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n\
\ \"acc_stderr\": 0.015537514263253862,\n \"acc_norm\": 0.9401709401709402,\n\
\ \"acc_norm_stderr\": 0.015537514263253862\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352202,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352202\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9106002554278416,\n\
\ \"acc_stderr\": 0.010203017847688298,\n \"acc_norm\": 0.9106002554278416,\n\
\ \"acc_norm_stderr\": 0.010203017847688298\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8236994219653179,\n \"acc_stderr\": 0.020516425672490714,\n\
\ \"acc_norm\": 0.8236994219653179,\n \"acc_norm_stderr\": 0.020516425672490714\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7787709497206704,\n\
\ \"acc_stderr\": 0.013882164598887293,\n \"acc_norm\": 0.7787709497206704,\n\
\ \"acc_norm_stderr\": 0.013882164598887293\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.020279402936174588,\n\
\ \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.020279402936174588\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8392282958199357,\n\
\ \"acc_stderr\": 0.020862388082391884,\n \"acc_norm\": 0.8392282958199357,\n\
\ \"acc_norm_stderr\": 0.020862388082391884\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8796296296296297,\n \"acc_stderr\": 0.018105414094329676,\n\
\ \"acc_norm\": 0.8796296296296297,\n \"acc_norm_stderr\": 0.018105414094329676\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.648936170212766,\n \"acc_stderr\": 0.02847350127296375,\n \
\ \"acc_norm\": 0.648936170212766,\n \"acc_norm_stderr\": 0.02847350127296375\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5977835723598436,\n\
\ \"acc_stderr\": 0.012523646856180178,\n \"acc_norm\": 0.5977835723598436,\n\
\ \"acc_norm_stderr\": 0.012523646856180178\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.023157468308559352,\n\
\ \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.023157468308559352\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8300653594771242,\n \"acc_stderr\": 0.01519415311318474,\n \
\ \"acc_norm\": 0.8300653594771242,\n \"acc_norm_stderr\": 0.01519415311318474\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n\
\ \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.7454545454545455,\n\
\ \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8489795918367347,\n \"acc_stderr\": 0.022923004094736847,\n\
\ \"acc_norm\": 0.8489795918367347,\n \"acc_norm_stderr\": 0.022923004094736847\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n\
\ \"acc_stderr\": 0.022509345325101706,\n \"acc_norm\": 0.8855721393034826,\n\
\ \"acc_norm_stderr\": 0.022509345325101706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.038641399236991225,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.038641399236991225\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.024648068961366152,\n\
\ \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.024648068961366152\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5777233782129743,\n\
\ \"mc1_stderr\": 0.017290733254248177,\n \"mc2\": 0.7328348537061722,\n\
\ \"mc2_stderr\": 0.01412262997996187\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8318863456985004,\n \"acc_stderr\": 0.010510336954166737\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7088703563305534,\n \
\ \"acc_stderr\": 0.012513215297888463\n }\n}\n```"
repo_url: https://huggingface.co/cloudyu/Truthful_DPO_TomGrc_FusionNet_34Bx2_MoE
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|arc:challenge|25_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|gsm8k|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hellaswag|10_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T20-13-45.789253.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T20-13-45.789253.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- '**/details_harness|winogrande|5_2024-01-25T20-13-45.789253.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-25T20-13-45.789253.parquet'
- config_name: results
data_files:
- split: 2024_01_25T20_13_45.789253
path:
- results_2024-01-25T20-13-45.789253.parquet
- split: latest
path:
- results_2024-01-25T20-13-45.789253.parquet
---
# Dataset Card for Evaluation run of cloudyu/Truthful_DPO_TomGrc_FusionNet_34Bx2_MoE
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cloudyu/Truthful_DPO_TomGrc_FusionNet_34Bx2_MoE](https://huggingface.co/cloudyu/Truthful_DPO_TomGrc_FusionNet_34Bx2_MoE) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cloudyu__Truthful_DPO_TomGrc_FusionNet_34Bx2_MoE",
"harness_winogrande_5",
split="train")
```
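As a side note, the config names listed in the YAML above appear to follow a `harness_<task>_<num_fewshot>` pattern, with `:` and `-` in task names flattened to underscores. A small, purely illustrative helper (not part of the `datasets` library) sketches that assumed convention:

```python
def harness_config_name(task: str, num_fewshot: int) -> str:
    """Build a config name following the harness_<task>_<fewshot> pattern
    seen in this card (e.g. harness_winogrande_5). Hypothetical helper;
    the pattern is inferred from the names listed above."""
    # Colons and dashes in task names are flattened to underscores.
    safe_task = task.replace(":", "_").replace("-", "_")
    return f"harness_{safe_task}_{num_fewshot}"

print(harness_config_name("winogrande", 5))              # harness_winogrande_5
print(harness_config_name("truthfulqa:mc", 0))           # harness_truthfulqa_mc_0
print(harness_config_name("hendrycksTest-virology", 5))  # harness_hendrycksTest_virology_5
```

These generated names can then be passed as the second argument to `load_dataset` as in the example above.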
## Latest results
These are the [latest results from run 2024-01-25T20:13:45.789253](https://huggingface.co/datasets/open-llm-leaderboard/details_cloudyu__Truthful_DPO_TomGrc_FusionNet_34Bx2_MoE/blob/main/results_2024-01-25T20-13-45.789253.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; each eval's results can be found in its own config, under the "latest" split):
```json
{
"all": {
"acc": 0.7669297681429887,
"acc_stderr": 0.028190436925044526,
"acc_norm": 0.7705423152798676,
"acc_norm_stderr": 0.02872789012012348,
"mc1": 0.5777233782129743,
"mc1_stderr": 0.017290733254248177,
"mc2": 0.7328348537061722,
"mc2_stderr": 0.01412262997996187
},
"harness|arc:challenge|25": {
"acc": 0.7022184300341296,
"acc_stderr": 0.013363080107244485,
"acc_norm": 0.7286689419795221,
"acc_norm_stderr": 0.012993807727545789
},
"harness|hellaswag|10": {
"acc": 0.6715793666600279,
"acc_stderr": 0.0046867890424453695,
"acc_norm": 0.865166301533559,
"acc_norm_stderr": 0.003408478333768256
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7481481481481481,
"acc_stderr": 0.03749850709174021,
"acc_norm": 0.7481481481481481,
"acc_norm_stderr": 0.03749850709174021
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.881578947368421,
"acc_stderr": 0.026293995855474938,
"acc_norm": 0.881578947368421,
"acc_norm_stderr": 0.026293995855474938
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8075471698113208,
"acc_stderr": 0.024262979839372274,
"acc_norm": 0.8075471698113208,
"acc_norm_stderr": 0.024262979839372274
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.0289198029561349,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.0289198029561349
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.03295304696818317,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.03295304696818317
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7914893617021277,
"acc_stderr": 0.026556982117838725,
"acc_norm": 0.7914893617021277,
"acc_norm_stderr": 0.026556982117838725
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7448275862068966,
"acc_stderr": 0.03632984052707842,
"acc_norm": 0.7448275862068966,
"acc_norm_stderr": 0.03632984052707842
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7354497354497355,
"acc_stderr": 0.02271746789770862,
"acc_norm": 0.7354497354497355,
"acc_norm_stderr": 0.02271746789770862
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.896774193548387,
"acc_stderr": 0.01730838128103451,
"acc_norm": 0.896774193548387,
"acc_norm_stderr": 0.01730838128103451
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6403940886699507,
"acc_stderr": 0.03376458246509567,
"acc_norm": 0.6403940886699507,
"acc_norm_stderr": 0.03376458246509567
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8666666666666667,
"acc_stderr": 0.026544435312706463,
"acc_norm": 0.8666666666666667,
"acc_norm_stderr": 0.026544435312706463
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9242424242424242,
"acc_stderr": 0.018852670234993093,
"acc_norm": 0.9242424242424242,
"acc_norm_stderr": 0.018852670234993093
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9740932642487047,
"acc_stderr": 0.011464523356953162,
"acc_norm": 0.9740932642487047,
"acc_norm_stderr": 0.011464523356953162
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.019457390787681803,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.019457390787681803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.030417716961717477,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.030417716961717477
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8487394957983193,
"acc_stderr": 0.023274255898707946,
"acc_norm": 0.8487394957983193,
"acc_norm_stderr": 0.023274255898707946
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4966887417218543,
"acc_stderr": 0.04082393379449654,
"acc_norm": 0.4966887417218543,
"acc_norm_stderr": 0.04082393379449654
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9155963302752294,
"acc_stderr": 0.011918819327334886,
"acc_norm": 0.9155963302752294,
"acc_norm_stderr": 0.011918819327334886
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03214952147802749,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03214952147802749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.019907399791316945,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.019907399791316945
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8987341772151899,
"acc_stderr": 0.019637720526065522,
"acc_norm": 0.8987341772151899,
"acc_norm_stderr": 0.019637720526065522
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7892376681614349,
"acc_stderr": 0.02737309550054019,
"acc_norm": 0.7892376681614349,
"acc_norm_stderr": 0.02737309550054019
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.028718776889342323,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.028718776889342323
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8925619834710744,
"acc_stderr": 0.028268812192540637,
"acc_norm": 0.8925619834710744,
"acc_norm_stderr": 0.028268812192540637
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.03038159675665167,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.03038159675665167
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.901840490797546,
"acc_stderr": 0.0233761802310596,
"acc_norm": 0.901840490797546,
"acc_norm_stderr": 0.0233761802310596
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6071428571428571,
"acc_stderr": 0.046355501356099754,
"acc_norm": 0.6071428571428571,
"acc_norm_stderr": 0.046355501356099754
},
"harness|hendrycksTest-management|5": {
"acc": 0.8932038834951457,
"acc_stderr": 0.030581088928331366,
"acc_norm": 0.8932038834951457,
"acc_norm_stderr": 0.030581088928331366
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9401709401709402,
"acc_stderr": 0.015537514263253862,
"acc_norm": 0.9401709401709402,
"acc_norm_stderr": 0.015537514263253862
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352202,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352202
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9106002554278416,
"acc_stderr": 0.010203017847688298,
"acc_norm": 0.9106002554278416,
"acc_norm_stderr": 0.010203017847688298
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8236994219653179,
"acc_stderr": 0.020516425672490714,
"acc_norm": 0.8236994219653179,
"acc_norm_stderr": 0.020516425672490714
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7787709497206704,
"acc_stderr": 0.013882164598887293,
"acc_norm": 0.7787709497206704,
"acc_norm_stderr": 0.013882164598887293
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.020279402936174588,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.020279402936174588
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8392282958199357,
"acc_stderr": 0.020862388082391884,
"acc_norm": 0.8392282958199357,
"acc_norm_stderr": 0.020862388082391884
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8796296296296297,
"acc_stderr": 0.018105414094329676,
"acc_norm": 0.8796296296296297,
"acc_norm_stderr": 0.018105414094329676
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.648936170212766,
"acc_stderr": 0.02847350127296375,
"acc_norm": 0.648936170212766,
"acc_norm_stderr": 0.02847350127296375
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5977835723598436,
"acc_stderr": 0.012523646856180178,
"acc_norm": 0.5977835723598436,
"acc_norm_stderr": 0.012523646856180178
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.023157468308559352,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.023157468308559352
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8300653594771242,
"acc_stderr": 0.01519415311318474,
"acc_norm": 0.8300653594771242,
"acc_norm_stderr": 0.01519415311318474
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.041723430387053825,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.041723430387053825
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8489795918367347,
"acc_stderr": 0.022923004094736847,
"acc_norm": 0.8489795918367347,
"acc_norm_stderr": 0.022923004094736847
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101706,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.038641399236991225,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.038641399236991225
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.024648068961366152,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.024648068961366152
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5777233782129743,
"mc1_stderr": 0.017290733254248177,
"mc2": 0.7328348537061722,
"mc2_stderr": 0.01412262997996187
},
"harness|winogrande|5": {
"acc": 0.8318863456985004,
"acc_stderr": 0.010510336954166737
},
"harness|gsm8k|5": {
"acc": 0.7088703563305534,
"acc_stderr": 0.012513215297888463
}
}
```
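Since the results blob above is plain JSON, per-task scores can be ranked with nothing but the standard library. A minimal sketch, using a truncated stand-in for the full blob (only three of the MMLU entries shown above):

```python
import json

# Truncated stand-in for the results blob above; the full blob has the same shape.
results = json.loads("""
{
  "harness|hendrycksTest-marketing|5": {"acc": 0.9401709401709402},
  "harness|hendrycksTest-world_religions|5": {"acc": 0.8830409356725146},
  "harness|hendrycksTest-virology|5": {"acc": 0.5602409638554217}
}
""")

# Keep only the MMLU (hendrycksTest) entries, keyed by the task name
# between the pipes, and sort by accuracy, best first.
mmlu = {
    task.split("|")[1]: scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
}
for task, acc in sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{task}: {acc:.3f}")
```

The same filtering works for the other harness tasks by changing the prefix test.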
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
strombergnlp/named_timexes | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: Named Temporal Expressions dataset
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- token-classification
task_ids: []
---
# Dataset Card for named_timexes
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:** [https://aclanthology.org/R13-1015/](https://aclanthology.org/R13-1015/)
- **Leaderboard:**
- **Point of Contact:** [Leon Derczynski](https://github.com/leondz)
### Dataset Summary
This is a dataset annotated for _named temporal expression_ chunks.
The commonest temporal expressions typically contain date and time words, like April or hours. Research into recognising and interpreting these typical expressions is mature in many languages. However, there is a class of expressions that are less typical, very varied, and difficult to automatically interpret. These indicate dates and times, but are harder to detect because they often do not contain time words and are not used frequently enough to appear in conventional temporally-annotated corpora – for example *Michaelmas* or *Vasant Panchami*.
For more details see [Recognising and Interpreting Named Temporal Expressions](https://aclanthology.org/R13-1015.pdf)
### Supported Tasks and Leaderboards
* Task: Named Entity Recognition (temporal expressions)
### Languages
English
## Dataset Structure
### Data Instances
### Data Fields
Each tweet contains an ID, a list of tokens, and a list of timex chunk flags.
- `id`: a `string` feature.
- `tokens`: a `list` of `strings`.
- `ntimex_tags`: a `list` of class IDs (`int`s) for whether a token is out-of-timex or in a timex chunk.
```
0: O
1: T
```
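As a minimal sketch of working with this tag scheme (the helper below is illustrative, not part of the dataset loader), consecutive `T`-tagged tokens can be grouped back into timex chunks:

```python
# Illustrative helper (not part of the dataset): group consecutive
# T-tagged (1) tokens into named-timex chunks; 0 marks out-of-timex.
def timex_chunks(tokens, ntimex_tags):
    chunks, current = [], []
    for token, tag in zip(tokens, ntimex_tags):
        if tag == 1:
            current.append(token)
        elif current:
            chunks.append(" ".join(current))
            current = []
    if current:  # flush a chunk that runs to the end of the sequence
        chunks.append(" ".join(current))
    return chunks

tokens = ["See", "you", "at", "Michaelmas", "term", "!"]
tags = [0, 0, 0, 1, 1, 0]
print(timex_chunks(tokens, tags))  # ['Michaelmas term']
```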
### Data Splits
Section|Token count
---|---:
train|87 050
test|30 010
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
Creative Commons Attribution 4.0 International (CC BY 4.0)
### Citation Information
```
@inproceedings{brucato-etal-2013-recognising,
title = "Recognising and Interpreting Named Temporal Expressions",
author = "Brucato, Matteo and
Derczynski, Leon and
Llorens, Hector and
Bontcheva, Kalina and
Jensen, Christian S.",
booktitle = "Proceedings of the International Conference Recent Advances in Natural Language Processing {RANLP} 2013",
month = sep,
year = "2013",
address = "Hissar, Bulgaria",
publisher = "INCOMA Ltd. Shoumen, BULGARIA",
url = "https://aclanthology.org/R13-1015",
pages = "113--121",
}
```
### Contributions
Author-added dataset [@leondz](https://github.com/leondz)
|
gavmac00/nextjs-app-docs | ---
license: cc-by-3.0
---
|
andersonbcdefg/fake_dataset | ---
dataset_info:
features:
- name: tokens
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 6240
num_examples: 8
download_size: 5472
dataset_size: 6240
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "fake_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jumtra/jglue_jnli | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 647839
num_examples: 3079
download_size: 196877
dataset_size: 647839
---
# Dataset Card for "jglue_jnli"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Praghxx/Gielav2 | ---
license: openrail
---
|
noahshinn/ts-code2td | ---
license: mit
---
## Dataset Description
A dataset of pairs mapping TypeScript code to its appropriate type declarations.
## Language
TypeScript only.
## To Load
```python
from datasets import load_dataset
load_dataset("noahshinn024/ts-code2td")
```
## Distribution of type declaration code lengths
- uses the tokenizer from [bigcode/santacoder](https://huggingface.co/bigcode/santacoder)

|
prooompt/test_dataset | ---
license: mit
---
# This dataset is for testing purposes... blah blah
## About the dataset
- Mixture of prompt and answer completions taken from Subnet18 ... |
strombergnlp/rumoureval_2019 | ---
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets: []
task_categories:
- text-classification
task_ids:
- fact-checking
pretty_name: RumourEval 2019
tags:
- stance-detection
---
# Dataset Card for "rumoureval_2019"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://competitions.codalab.org/competitions/19938](https://competitions.codalab.org/competitions/19938)
- **Repository:** [https://figshare.com/articles/dataset/RumourEval_2019_data/8845580](https://figshare.com/articles/dataset/RumourEval_2019_data/8845580)
- **Paper:** [https://aclanthology.org/S19-2147/](https://aclanthology.org/S19-2147/), [https://arxiv.org/abs/1809.06683](https://arxiv.org/abs/1809.06683)
- **Point of Contact:** [Leon Derczynski](https://github.com/leondz)
- **Size of downloaded dataset files:**
- **Size of the generated dataset:**
- **Total amount of disk used:**
### Dataset Summary
Stance prediction task in English. The goal is to predict whether a given reply to a claim either supports, denies, questions, or simply comments on the claim. Ran as a SemEval task in 2019.
### Supported Tasks and Leaderboards
* SemEval 2019 Task 7
### Languages
English of various origins, bcp47: `en`
## Dataset Structure
### Data Instances
#### rumoureval_2019
An example of 'train' looks as follows.
```
{
'id': '0',
'source_text': 'Appalled by the attack on Charlie Hebdo in Paris, 10 - probably journalists - now confirmed dead. An attack on free speech everywhere.',
'reply_text': '@m33ryg @tnewtondunn @mehdirhasan Of course it is free speech, that\'s the definition of "free speech" to openly make comments or draw a pic!',
'label': 3
}
```
### Data Fields
- `id`: a `string` feature.
- `source_text`: a `string` expressing a claim/topic.
- `reply_text`: a `string` to be classified for its stance to the source.
- `label`: a class label representing the stance the text expresses towards the target. Full tagset with indices:
```
0: "support",
1: "deny",
2: "query",
3: "comment"
```
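As a minimal illustration (the mapping below simply restates the tagset above), the integer labels can be converted back to stance names:

```python
# Illustrative mapping that restates the tagset above.
STANCE = {0: "support", 1: "deny", 2: "query", 3: "comment"}

example = {"id": "0", "label": 3}  # abbreviated form of the instance shown earlier
print(STANCE[example["label"]])  # comment
```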
### Data Splits
| name |instances|
|---------|----:|
|train|7 005|
|dev|2 425|
|test|2 945|
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
Twitter users
### Annotations
#### Annotation process
Detailed in [Analysing How People Orient to and Spread Rumours in Social Media by Looking at Conversational Threads](https://journals.plos.org/plosone/article/authors?id=10.1371/journal.pone.0150989)
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
The dataset is curated by the paper's authors.
### Licensing Information
The authors distribute this data under Creative Commons attribution license, CC-BY 4.0.
### Citation Information
```
@inproceedings{gorrell-etal-2019-semeval,
title = "{S}em{E}val-2019 Task 7: {R}umour{E}val, Determining Rumour Veracity and Support for Rumours",
author = "Gorrell, Genevieve and
Kochkina, Elena and
Liakata, Maria and
Aker, Ahmet and
Zubiaga, Arkaitz and
Bontcheva, Kalina and
Derczynski, Leon",
booktitle = "Proceedings of the 13th International Workshop on Semantic Evaluation",
month = jun,
year = "2019",
address = "Minneapolis, Minnesota, USA",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/S19-2147",
doi = "10.18653/v1/S19-2147",
pages = "845--854",
}
```
### Contributions
Author-added dataset [@leondz](https://github.com/leondz)
|
AdapterOcean/med_alpaca_standardized_cluster_46_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 21057250
num_examples: 38172
download_size: 10381676
dataset_size: 21057250
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_46_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KyonBS/itadoriKunoichiTsubaki | ---
license: openrail
---
|
furry-br/lute_v2 | ---
license: openrail
---
|
open-llm-leaderboard/details_lvkaokao__llama2-7b-hf-chat-lora-v2 | ---
pretty_name: Evaluation run of lvkaokao/llama2-7b-hf-chat-lora-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lvkaokao/llama2-7b-hf-chat-lora-v2](https://huggingface.co/lvkaokao/llama2-7b-hf-chat-lora-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lvkaokao__llama2-7b-hf-chat-lora-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T19:43:28.899115](https://huggingface.co/datasets/open-llm-leaderboard/details_lvkaokao__llama2-7b-hf-chat-lora-v2/blob/main/results_2023-09-17T19-43-28.899115.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.25723573825503354,\n\
\ \"em_stderr\": 0.004476419757548592,\n \"f1\": 0.31864408557046997,\n\
\ \"f1_stderr\": 0.004427420085857621,\n \"acc\": 0.42871444189201235,\n\
\ \"acc_stderr\": 0.010374814363571815\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.25723573825503354,\n \"em_stderr\": 0.004476419757548592,\n\
\ \"f1\": 0.31864408557046997,\n \"f1_stderr\": 0.004427420085857621\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10841546626231995,\n \
\ \"acc_stderr\": 0.008563852506627476\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7490134175217048,\n \"acc_stderr\": 0.012185776220516155\n\
\ }\n}\n```"
repo_url: https://huggingface.co/lvkaokao/llama2-7b-hf-chat-lora-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T19_43_28.899115
path:
- '**/details_harness|drop|3_2023-09-17T19-43-28.899115.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T19-43-28.899115.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T19_43_28.899115
path:
- '**/details_harness|gsm8k|5_2023-09-17T19-43-28.899115.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T19-43-28.899115.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T19_43_28.899115
path:
- '**/details_harness|winogrande|5_2023-09-17T19-43-28.899115.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T19-43-28.899115.parquet'
- config_name: results
data_files:
- split: 2023_09_17T19_43_28.899115
path:
- results_2023-09-17T19-43-28.899115.parquet
- split: latest
path:
- results_2023-09-17T19-43-28.899115.parquet
---
# Dataset Card for Evaluation run of lvkaokao/llama2-7b-hf-chat-lora-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lvkaokao/llama2-7b-hf-chat-lora-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lvkaokao/llama2-7b-hf-chat-lora-v2](https://huggingface.co/lvkaokao/llama2-7b-hf-chat-lora-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lvkaokao__llama2-7b-hf-chat-lora-v2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T19:43:28.899115](https://huggingface.co/datasets/open-llm-leaderboard/details_lvkaokao__llama2-7b-hf-chat-lora-v2/blob/main/results_2023-09-17T19-43-28.899115.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.25723573825503354,
"em_stderr": 0.004476419757548592,
"f1": 0.31864408557046997,
"f1_stderr": 0.004427420085857621,
"acc": 0.42871444189201235,
"acc_stderr": 0.010374814363571815
},
"harness|drop|3": {
"em": 0.25723573825503354,
"em_stderr": 0.004476419757548592,
"f1": 0.31864408557046997,
"f1_stderr": 0.004427420085857621
},
"harness|gsm8k|5": {
"acc": 0.10841546626231995,
"acc_stderr": 0.008563852506627476
},
"harness|winogrande|5": {
"acc": 0.7490134175217048,
"acc_stderr": 0.012185776220516155
}
}
```
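As a quick sanity check (a sketch using only the numbers printed above; the leaderboard's own aggregation code may differ), the `acc` in the `all` block matches the mean of the per-task `acc` values:

```python
# Sketch: recompute the aggregate "acc" from the per-task values above.
per_task_acc = {
    "harness|gsm8k|5": 0.10841546626231995,
    "harness|winogrande|5": 0.7490134175217048,
}
mean_acc = sum(per_task_acc.values()) / len(per_task_acc)
print(round(mean_acc, 6))  # 0.428714
```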
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
yzhuang/metatree_fri_c2_1000_50 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: X
sequence: float64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 299460
num_examples: 713
- name: validation
num_bytes: 120540
num_examples: 287
download_size: 504473
dataset_size: 420000
---
# Dataset Card for "metatree_fri_c2_1000_50"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SEIEZ/test1-ru-pretrain-voque | ---
license: mit
---
|
CyberHarem/julia_theidolmstermillionlive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of julia (THE iDOLM@STER: Million Live!)
This is the dataset of julia (THE iDOLM@STER: Million Live!), containing 172 images and their tags.
The core tags of this character are `short_hair, red_hair, blue_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 172 | 196.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/julia_theidolmstermillionlive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 172 | 127.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/julia_theidolmstermillionlive/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 379 | 247.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/julia_theidolmstermillionlive/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 172 | 175.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/julia_theidolmstermillionlive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 379 | 323.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/julia_theidolmstermillionlive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/julia_theidolmstermillionlive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, navel, solo, cleavage, collarbone, looking_at_viewer, medium_breasts, open_mouth, outdoors, blush, day, necklace, white_bikini, earrings, hair_between_eyes, hair_flower, smile, star_(symbol), cowboy_shot, frilled_bikini, front-tie_bikini_top, hibiscus, sky, straw_hat |
| 1 | 6 |  |  |  |  |  | 1girl, kimono, smile, solo, looking_at_viewer, hair_flower, blush, brown_hair, cherry_blossoms, petals |
| 2 | 14 |  |  |  |  |  | 1girl, electric_guitar, smile, solo, looking_at_viewer, star_(symbol), character_name, choker, plectrum, skirt, bracelet |
| 3 | 10 |  |  |  |  |  | 1girl, open_mouth, brown_hair, :d, skirt, choker, looking_at_viewer, solo, blush, dress, heart |
| 4 | 9 |  |  |  |  |  | 1girl, solo, blush, collarbone, looking_at_viewer, hair_between_eyes, bangs, breasts, upper_body, jewelry, open_mouth, smile, white_background, shirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | navel | solo | cleavage | collarbone | looking_at_viewer | medium_breasts | open_mouth | outdoors | blush | day | necklace | white_bikini | earrings | hair_between_eyes | hair_flower | smile | star_(symbol) | cowboy_shot | frilled_bikini | front-tie_bikini_top | hibiscus | sky | straw_hat | kimono | brown_hair | cherry_blossoms | petals | electric_guitar | character_name | choker | plectrum | skirt | bracelet | :d | dress | heart | bangs | breasts | upper_body | jewelry | white_background | shirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:-----------|:-------------|:--------------------|:-----------------|:-------------|:-----------|:--------|:------|:-----------|:---------------|:-----------|:--------------------|:--------------|:--------|:----------------|:--------------|:-----------------|:-----------------------|:-----------|:------|:------------|:---------|:-------------|:------------------|:---------|:------------------|:-----------------|:---------|:-----------|:--------|:-----------|:-----|:--------|:--------|:--------|:----------|:-------------|:----------|:-------------------|:--------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | | X | | | X | | | | X | | | | | | X | X | | | | | | | | X | X | X | X | | | | | | | | | | | | | | | |
| 2 | 14 |  |  |  |  |  | X | | X | | | X | | | | | | | | | | | X | X | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | |
| 3 | 10 |  |  |  |  |  | X | | X | | | X | | X | | X | | | | | | | | | | | | | | | | X | | | | | X | | X | | X | X | X | | | | | | |
| 4 | 9 |  |  |  |  |  | X | | X | | X | X | | X | | X | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X |
|
ZharfaTech/ZharfaTech-OpenAssistant-Guanaco-Persian-Farsi | ---
license: apache-2.0
task_categories:
- text-generation
- text2text-generation
language:
- fa
pretty_name: persian-guanaco
size_categories:
- 1K<n<10K
---
# Persian OpenAssistant-Guanaco Dataset
## About ZharfaTech
ZharfaTech is at the forefront of developing advanced Language Learning Models (LLMs) specifically for the Persian language, aiming to empower over 100 million Persian speakers worldwide. Our objective is to bridge the digital gap in services leveraging LLMs, such as content generation, translation, and customer relationship systems, by providing tailored open-source and closed-source LLM solutions. We focus on democratizing access to LLM technology for Persian language users, developers, and businesses, fostering innovation and collaboration within the community.
## Dataset Overview
This dataset is the Persian translation of the "openassistant-guanaco" dataset, originally found at [https://huggingface.co/datasets/timdettmers/openassistant-guanaco](https://huggingface.co/datasets/timdettmers/openassistant-guanaco). It has been translated to cater to the nuances of the Persian language, utilizing a high-performance local translation model. The translation process was completed in 12 hours using a single Nvidia GPU, ensuring a blend of speed and accuracy.
### Key Features:
- **Language:** Persian
- **Source:** Translated from "openassistant-guanaco"
- **Translation Method:** Local translation model
- **Processing Time:** 12 hours on a single Nvidia GPU
## Objective and Scope
ZharfaTech is dedicated to enhancing the capabilities and reach of LLM technologies for the Persian language through:
- Development of fine-tuned open-source models for the Persian language.
- Creation of specialized datasets to support extensive training and refinement.
- Advanced closed-source model development for specialized solutions.
Our dual approach of fostering community collaboration and providing high-value, specialized solutions aims to advance LLM technologies for the Persian language, making significant strides towards inclusivity and accessibility in digital services.
## How to Use This Dataset
This dataset is intended for researchers, developers, and businesses interested in developing Persian language capabilities in their LLMs. It can be used to train models for a variety of applications, including but not limited to natural language understanding, content generation, and customer interaction systems.
To access and utilize this dataset, please follow the instructions below:
1. Visit our dataset page on Hugging Face: [https://huggingface.co/datasets/ZharfaTech/openassistant-guanaco-persian-instruct-fa]
2. Review the dataset documentation for details on structure and content.
3. Download the dataset using the provided Hugging Face commands or API.
## Contributing
We welcome contributions from the community to improve and expand this dataset.
## Acknowledgments
We extend our gratitude to the creators of the "openassistant-guanaco" dataset for providing the foundation for this translation. Our thanks also go to the dedicated team members who utilized their expertise to ensure the accuracy and relevance of this Persian translation.
## License
This dataset is available under an apache-2.0 license, aligning with the original "openassistant-guanaco" dataset's licensing terms. For more information, please review the license details on our dataset page.
## Contact Us
For more information about ZharfaTech and our projects, or if you have any questions regarding this dataset, please contact us at [https://zharfa.tech].
---
ZharfaTech: Empowering Persian language speakers with advanced LLM technology. |
YufeiHFUT/bioRED | ---
dataset_info:
features:
- name: data
dtype: string
splits:
- name: train
num_bytes: 13760785
num_examples: 3831
- name: validation
num_bytes: 4163807
num_examples: 1114
- name: test
num_bytes: 3637208
num_examples: 990
download_size: 2884661
dataset_size: 21561800
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
Tgratzi/rule-viewer-tql | ---
dataset_info:
features:
- name: input
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 44791.25153374233
num_examples: 293
- name: test
num_bytes: 5044.748466257669
num_examples: 33
download_size: 17014
dataset_size: 49836.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
irds/trec-cast_v0 | ---
pretty_name: '`trec-cast/v0`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `trec-cast/v0`
The `trec-cast/v0` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/trec-cast#trec-cast/v0).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=47,696,605
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/trec-cast_v0', 'docs')
for record in docs:
record # {'doc_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@inproceedings{Dalton2019Cast,
title={CAsT 2019: The Conversational Assistance Track Overview},
author={Jeffrey Dalton and Chenyan Xiong and Jamie Callan},
booktitle={TREC},
year={2019}
}
```
|
senhorsapo/eu | ---
license: openrail
---
|
louisbrulenaudet/code-procedure-penale | ---
license: apache-2.0
language:
- fr
multilinguality:
- monolingual
tags:
- finetuning
- legal
- french law
- droit français
- Code de procédure pénale
source_datasets:
- original
pretty_name: Code de procédure pénale
task_categories:
- text-generation
- table-question-answering
- summarization
- text-retrieval
- question-answering
- text-classification
size_categories:
- 1K<n<10K
---
# Code de procédure pénale, non-instruct (2024-04-15)
This project focuses on fine-tuning pre-trained language models to create efficient and accurate models for legal practice.
Fine-tuning is the process of adapting a pre-trained model to perform specific tasks or cater to particular domains. It involves adjusting the model's parameters through a further round of training on task-specific or domain-specific data. While conventional fine-tuning strategies involve supervised learning with labeled data, instruction-based fine-tuning introduces a more structured and interpretable approach.
Instruction-based fine-tuning leverages the power of human-provided instructions to guide the model's behavior. These instructions can be in the form of text prompts, prompts with explicit task descriptions, or a combination of both. This approach allows for a more controlled and context-aware interaction with the LLM, making it adaptable to a multitude of specialized tasks.
Instruction-based fine-tuning significantly enhances the performance of LLMs in the following ways:
- Task-Specific Adaptation: LLMs, when fine-tuned with specific instructions, exhibit remarkable adaptability to diverse tasks. They can switch seamlessly between translation, summarization, and question-answering, guided by the provided instructions.
- Reduced Ambiguity: Traditional LLMs might generate ambiguous or contextually inappropriate responses. Instruction-based fine-tuning allows for a clearer and more context-aware generation, reducing the likelihood of nonsensical outputs.
- Efficient Knowledge Transfer: Instructions can encapsulate domain-specific knowledge, enabling LLMs to benefit from expert guidance. This knowledge transfer is particularly valuable in fields like tax practice, law, medicine, and more.
- Interpretability: Instruction-based fine-tuning also makes LLM behavior more interpretable. Since the instructions are human-readable, it becomes easier to understand and control model outputs.
- Adaptive Behavior: LLMs, post instruction-based fine-tuning, exhibit adaptive behavior that is responsive to both explicit task descriptions and implicit cues within the provided text.
## Concurrent reading of the LegalKit
To use all the legal data published on LegalKit, you can use this code snippet:
```python
# -*- coding: utf-8 -*-
import concurrent.futures
import logging
import os

import datasets
from tqdm.notebook import tqdm
def dataset_loader(
name:str,
streaming:bool=True
) -> datasets.Dataset:
"""
Helper function to load a single dataset in parallel.
Parameters
----------
name : str
Name of the dataset to be loaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
dataset : datasets.Dataset
Loaded dataset object.
Raises
------
Exception
If an error occurs during dataset loading.
"""
try:
return datasets.load_dataset(
name,
split="train",
streaming=streaming
)
except Exception as exc:
logging.error(f"Error loading dataset {name}: {exc}")
return None
def load_datasets(
req:list,
streaming:bool=True
) -> list:
"""
Downloads datasets specified in a list and creates a list of loaded datasets.
Parameters
----------
req : list
A list containing the names of datasets to be downloaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
datasets_list : list
A list containing loaded datasets as per the requested names provided in 'req'.
Raises
------
Exception
If an error occurs during dataset loading or processing.
Examples
--------
>>> datasets = load_datasets(["dataset1", "dataset2"], streaming=False)
"""
datasets_list = []
with concurrent.futures.ThreadPoolExecutor() as executor:
future_to_dataset = {executor.submit(dataset_loader, name): name for name in req}
for future in tqdm(concurrent.futures.as_completed(future_to_dataset), total=len(req)):
name = future_to_dataset[future]
try:
dataset = future.result()
if dataset:
datasets_list.append(dataset)
except Exception as exc:
logging.error(f"Error processing dataset {name}: {exc}")
return datasets_list
req = [
"louisbrulenaudet/code-artisanat",
"louisbrulenaudet/code-action-sociale-familles",
# ...
]
datasets_list = load_datasets(
req=req,
streaming=True
)
dataset = datasets.concatenate_datasets(
datasets_list
)
```
## Dataset generation
This JSON file is a list of dictionaries; each dictionary contains the following fields:
- `instruction`: `string`, presenting the instruction linked to the element.
- `input`: `string`, signifying the input details for the element.
- `output`: `string`, indicating the output information for the element.
- `start`: `string`, the date of entry into force of the article.
- `expiration`: `string`, the date of expiration of the article.
- `num`: `string`, the id of the article.
We used the following list of instructions for generating the dataset:
```python
instructions = [
"Compose l'intégralité de l'article sous forme écrite.",
"Écris la totalité du contenu de l'article.",
"Formule la totalité du texte présent dans l'article.",
"Produis l'intégralité de l'article en écriture.",
"Développe l'article dans son ensemble par écrit.",
"Génère l'ensemble du texte contenu dans l'article.",
"Formule le contenu intégral de l'article en entier.",
"Rédige la totalité du texte de l'article en entier.",
"Compose l'intégralité du contenu textuel de l'article.",
"Rédige l'ensemble du texte qui constitue l'article.",
"Formule l'article entier dans son contenu écrit.",
"Composez l'intégralité de l'article sous forme écrite.",
"Écrivez la totalité du contenu de l'article.",
"Formulez la totalité du texte présent dans l'article.",
"Développez l'article dans son ensemble par écrit.",
"Générez l'ensemble du texte contenu dans l'article.",
"Formulez le contenu intégral de l'article en entier.",
"Rédigez la totalité du texte de l'article en entier.",
"Composez l'intégralité du contenu textuel de l'article.",
"Écrivez l'article dans son intégralité en termes de texte.",
"Rédigez l'ensemble du texte qui constitue l'article.",
"Formulez l'article entier dans son contenu écrit.",
"Composer l'intégralité de l'article sous forme écrite.",
"Écrire la totalité du contenu de l'article.",
"Formuler la totalité du texte présent dans l'article.",
"Produire l'intégralité de l'article en écriture.",
"Développer l'article dans son ensemble par écrit.",
"Générer l'ensemble du texte contenu dans l'article.",
"Formuler le contenu intégral de l'article en entier.",
"Rédiger la totalité du texte de l'article en entier.",
"Composer l'intégralité du contenu textuel de l'article.",
"Rédiger l'ensemble du texte qui constitue l'article.",
"Formuler l'article entier dans son contenu écrit.",
"Quelles sont les dispositions de l'article ?",
"Quelles dispositions sont incluses dans l'article ?",
"Quelles sont les dispositions énoncées dans l'article ?",
"Quel est le texte intégral de l'article ?",
"Quelle est la lettre de l'article ?"
]
```
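Putting the two together, each record pairs one of these instructions with an article's text. The sketch below assembles such a record into a single Alpaca-style training prompt; the record values are illustrative, not taken from the dataset:

```python
# hypothetical record following the schema described above (values are illustrative)
record = {
    "instruction": "Quel est le texte intégral de l'article ?",
    "input": "Code de procédure pénale, article hypothétique",
    "output": "Texte intégral hypothétique de l'article.",
    "start": "2000-06-16",
    "expiration": "2999-01-01",
    "num": "1",
}

# simple Alpaca-style prompt template for instruction fine-tuning
prompt = (
    f"### Instruction:\n{record['instruction']}\n\n"
    f"### Input:\n{record['input']}\n\n"
    f"### Output:\n{record['output']}"
)
print(prompt)
```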
## Feedback
If you have any feedback, please reach out at [louisbrulenaudet@icloud.com](mailto:louisbrulenaudet@icloud.com). |
nath720/microso | ---
license: openrail
---
|
CyberHarem/voroshilov_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of voroshilov/ヴォロシーロフ/伏罗希洛夫 (Azur Lane)
This is the dataset of voroshilov/ヴォロシーロフ/伏罗希洛夫 (Azur Lane), containing 60 images and their tags.
The core tags of this character are `breasts, long_hair, blue_hair, large_breasts, bangs, purple_eyes, very_long_hair, hair_ornament, hair_flower`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 60 | 107.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/voroshilov_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 60 | 52.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/voroshilov_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 157 | 114.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/voroshilov_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 60 | 90.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/voroshilov_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 157 | 170.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/voroshilov_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/voroshilov_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, looking_at_viewer, solo, black_thighhighs, cleavage, bare_shoulders, flower, garter_straps, earrings, thighs, blush, white_dress, covered_navel, wide_sleeves, cowboy_shot, white_leotard, fur-trimmed_coat, parted_lips, simple_background, white_background, open_coat |
| 1 | 23 |  |  |  |  |  | 1girl, solo, looking_at_viewer, blush, cleavage, collarbone, wet, naked_towel, thighs, bare_shoulders, sitting, closed_mouth, onsen, water, parted_lips, red_eyes |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | black_thighhighs | cleavage | bare_shoulders | flower | garter_straps | earrings | thighs | blush | white_dress | covered_navel | wide_sleeves | cowboy_shot | white_leotard | fur-trimmed_coat | parted_lips | simple_background | white_background | open_coat | collarbone | wet | naked_towel | sitting | closed_mouth | onsen | water | red_eyes |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-------------------|:-----------|:-----------------|:---------|:----------------|:-----------|:---------|:--------|:--------------|:----------------|:---------------|:--------------|:----------------|:-------------------|:--------------|:--------------------|:-------------------|:------------|:-------------|:------|:--------------|:----------|:---------------|:--------|:--------|:-----------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 1 | 23 |  |  |  |  |  | X | X | X | | X | X | | | | X | X | | | | | | | X | | | | X | X | X | X | X | X | X | X |
|
jahb57/bert_embeddings_BATCH_10 | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: last_hidden_state
sequence:
sequence: float32
- name: pooler_output
sequence: float32
splits:
- name: train
num_bytes: 19763873524
num_examples: 100000
download_size: 19888225526
dataset_size: 19763873524
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
gguichard/wsd_myriade_synth_data_gpt4turbo_val_1 | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: wn_sens
sequence: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 4969434
num_examples: 7903
download_size: 1136746
dataset_size: 4969434
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kz919/open-orca-flan-50k-synthetic-reward-e5-mistral-7b-instruct-v3 | ---
license: apache-2.0
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
dtype: string
- name: task
dtype: string
- name: ignos-Mistral-T5-7B-v1
dtype: string
- name: cognAI-lil-c3po
dtype: string
- name: viethq188-Rabbit-7B-DPO-Chat
dtype: string
- name: cookinai-DonutLM-v1
dtype: string
- name: v1olet-v1olet-merged-dpo-7B
dtype: string
- name: normalized_rewards
sequence: float64
- name: router_label
dtype: int64
splits:
- name: train
num_bytes: 6770992
num_examples: 3067
download_size: 3143235
dataset_size: 6770992
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
SINAI/Emoti-SP | ---
license: cc-by-nc-sa-4.0
language:
- es
pretty_name: Emoti-sp
tags:
- Opinion Mining
- Sentiment Analysis
size_categories:
- n<1K
---
### Dataset Description
**Paper**: [SINAI: voting system for twitter sentiment analysis](https://aclanthology.org/S14-2100.pdf)
**Point of Contact**: emcamara@ujaen.es, sjzafra@ujaen.es
### Licensing Information
Emoti-SP is released under the [CC BY-NC-SA 4.0 License](https://creativecommons.org/licenses/by-nc-sa/4.0/).
### Citation Information
```bibtex
@inproceedings{martinez2014sinai,
title={SINAI: voting system for twitter sentiment analysis},
author={Mart{\'\i}nez-C{\'a}mara, Eugenio and Jim{\'e}nez-Zafra, Salud Maria and Martin-Valdivia, M Teresa and Lopez, L Alfonso Urena},
booktitle={Proceedings of the 8th International Workshop on Semantic Evaluation (SemEval 2014)},
pages={572--577},
year={2014}}
``` |
furry-br/angel-dustV2 | ---
license: openrail
---
|
Heng666/TED2020-TW-Corpus | ---
dataset_info:
- config_name: en-zh_tw
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 105192098
num_examples: 394054
download_size: 50558276
dataset_size: 105192098
- config_name: id-zh_tw
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 42245033
num_examples: 153365
download_size: 19374788
dataset_size: 42245033
- config_name: ja-zh_tw
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 101069421
num_examples: 351078
download_size: 47707306
dataset_size: 101069421
- config_name: ko-zh_tw
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 110871742
num_examples: 374075
download_size: 53243063
dataset_size: 110871742
- config_name: th-zh_tw
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 64742729
num_examples: 156328
download_size: 25868969
dataset_size: 64742729
- config_name: vi-zh_tw
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 95714104
num_examples: 314214
download_size: 43462345
dataset_size: 95714104
configs:
- config_name: en-zh_tw
data_files:
- split: train
path: en-zh_tw/train-*
- config_name: id-zh_tw
data_files:
- split: train
path: id-zh_tw/train-*
- config_name: ja-zh_tw
data_files:
- split: train
path: ja-zh_tw/train-*
- config_name: ko-zh_tw
data_files:
- split: train
path: ko-zh_tw/train-*
- config_name: th-zh_tw
data_files:
- split: train
path: th-zh_tw/train-*
- config_name: vi-zh_tw
data_files:
- split: train
path: vi-zh_tw/train-*
viewer: true
license: unknown
task_categories:
- translation
language:
- en
- ja
- ko
- id
- vi
- th
- tw
tags:
- taiwan
- translation
- Ted2020
pretty_name: TED2020-TW-Corpus
size_categories:
- 10M<n<100M
---
# Dataset Card for [TED2020-TW-Corpus]
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:** [Heng-Shiou Sheu](mailto:hengshiousheu@gmail.com)
### Dataset Summary
TED2020 is a multilingual dataset for machine translation benchmarks, derived from user-contributed translations collected and compiled by [OPUS](https://opus.nlpl.eu/TED2020/corpus/version/TED2020). The dataset includes test and development data sorted by language pair. It covers test sets for hundreds of language pairs and is continually updated. Please check the version number tag to cite the version you are using.
TED2020 collects talks from 1984 to 2020, covering a wide range of topics including science, technology, art, education, the environment, and social issues. The dataset is a valuable resource for studying and analyzing speakers' presentation styles, shifts in topics over time, and audience reactions.
### Supported Tasks and Leaderboards
### Languages
This dataset covers hundreds of languages and language pairs, organized by ISO-639-3 language codes. The current version covers the following languages: Traditional Chinese, English, Japanese, Korean, Indonesian, Vietnamese, and Thai.
## Dataset Structure
### Data Instances
The data is stored in comma-separated files with three fields: instruction, input, and output. Note that we do not imply a translation direction; the dataset is considered symmetric and serves as a test set for both directions.
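Because the dataset is treated as symmetric, a single record can be used in both translation directions. A minimal sketch with a hypothetical record (the field values are illustrative, not from the dataset):

```python
# hypothetical record with the three fields described above
record = {
    "instruction": "Translate the following text from English to Traditional Chinese.",
    "input": "Thank you so much, Chris.",
    "output": "非常感謝你,克里斯。",
}

def reverse_direction(rec: dict) -> dict:
    """Build the reverse-direction example from a symmetric translation pair."""
    return {
        "instruction": "Translate the following text from Traditional Chinese to English.",
        "input": rec["output"],
        "output": rec["input"],
    }

reversed_rec = reverse_direction(record)
print(reversed_rec["output"])  # Thank you so much, Chris.
```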
### Data Splits
Only the train split has been prepared so far.
## Dataset Creation
### Curation Rationale
This dataset will be continually updated and will be publicly released on GitHub in the future. High language coverage is the main goal of this project, and dataset preparation is kept consistent and systematic with standardized language labels and distribution formats.
### Source Data
#### Initial Data Collection and Normalization
The TED2020 dataset was collected from user-contributed translations submitted to [OPUS - TED2020](https://opus.nlpl.eu/TED2020/corpus/version/TED2020) and compiled into a multi-parallel corpus in [OPUS](https://opus.nlpl.eu).
#### Who are the source language producers?
These transcripts have been translated into more than 100 languages by a global community of volunteers. The parallel corpus and its validation code are available from [TED](https://www.ted.com/participate/translate), maintained by the University of Helsinki and its [language_technology_research group](https://blogs.helsinki.fi/language-technology/). The data and tools used to create and use the resource are [open source](https://github.com/Helsinki-NLP/Tatoeba-Challenge/) and will serve the [OPUS ecosystem](https://opus.nlpl.eu/) for parallel data and machine translation research.
### Personal and Sensitive Information
For information on the handling of personal and sensitive information, please consult the [original providers](https://opus.nlpl.eu/TED2020/corpus/version/TED2020) of the data. This dataset has not been processed in any way to detect or remove potentially sensitive or personal information.
### Social Impact of Dataset
The language coverage is high, so the dataset represents a very valuable resource for machine translation development, especially for lower-resource languages and language pairs. The growing database also represents a dynamic resource whose value will continue to increase.
### Other Known Limitations
The sentences are typically short and therefore easy to translate. For high-resource languages this makes the results less useful than more challenging benchmarks. For lower-resource language pairs, the limited complexity of the examples is actually a good way to measure progress, even in very challenging settings.
### Dataset Curators
This dataset was produced by Heng-Shiou Sheu.
### Licensing Information
These datasets are distributed under the [TED Talks Usage Policy](https://www.ted.com/about/our-organization/our-policies-terms/ted-talks-usage-policy). Details on the terms of use of the original dataset are listed [here](https://www.ted.com/about/our-organization/our-policies-terms/ted-talks-usage-policy).
### Citation Information
```
@inproceedings{Heng666/TED2020-TW-Corpus,
title={Taiwanese Phrases Multilingual Translation Dataset from TED2020 Talks},
author={Heng-Shiou Sheu},
year={2024},
url={https://huggingface.co/datasets/Heng666/TED2020-TW-Corpus},
}
``` |
nateraw/beans | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- mit
multilinguality:
- monolingual
pretty_name: Beans
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- other
task_ids:
- other-other-image-classification
---
# Dataset Card for Beans
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**[Beans Homepage](https://github.com/AI-Lab-Makerere/ibean/)
- **Repository:**[AI-Lab-Makerere/ibean](https://github.com/AI-Lab-Makerere/ibean/)
- **Paper:** N/A
- **Leaderboard:** N/A
- **Point of Contact:** N/A
### Dataset Summary
Beans leaf dataset with images of diseased and healthy leaves.
### Supported Tasks and Leaderboards
- image-classification
### Languages
English
## Dataset Structure
### Data Instances
A sample from the training set is provided below:
```
{
'image_file_path': '/root/.cache/huggingface/datasets/downloads/extracted/0aaa78294d4bf5114f58547e48d91b7826649919505379a167decb629aa92b0a/train/bean_rust/bean_rust_train.109.jpg',
'labels': 1
}
```
### Data Fields
The data instances have the following fields:
- `image_file_path`: a `string` filepath to an image.
- `labels`: an `int` classification label.
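The `labels` integer corresponds to a class whose name also appears as a directory in the image path; a minimal sketch using the sample record shown above (the path-to-class mapping is an assumption based on that sample):

```python
from pathlib import Path

# sample record from the training set shown above
sample = {
    'image_file_path': '/root/.cache/huggingface/datasets/downloads/extracted/0aaa78294d4bf5114f58547e48d91b7826649919505379a167decb629aa92b0a/train/bean_rust/bean_rust_train.109.jpg',
    'labels': 1,
}

# the human-readable class name is the parent directory of the image file
class_name = Path(sample['image_file_path']).parent.name
print(class_name)  # bean_rust
```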
### Data Splits
| name |train|validation|test|
|----------|----:|----:|----:|
|beans|1034|133|128|
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```
@ONLINE {beansdata,
author="Makerere AI Lab",
title="Bean disease dataset",
month="January",
year="2020",
url="https://github.com/AI-Lab-Makerere/ibean/"
}
```
### Contributions
Thanks to [@nateraw](https://github.com/nateraw) for adding this dataset.
|
aaw222/new_train_data | ---
annotations_creators:
- expert-generated
- crowdsourced
- machine-generated
language_creators:
- crowdsourced
- expert-generated
language:
- afr
- amh
- ara
- asm
- ast
- azj
- bel
- ben
- bos
- cat
- ceb
- cmn
- ces
- cym
- dan
- deu
- ell
- eng
- spa
- est
- fas
- ful
- fin
- tgl
- fra
- gle
- glg
- guj
- hau
- heb
- hin
- hrv
- hun
- hye
- ind
- ibo
- isl
- ita
- jpn
- jav
- kat
- kam
- kea
- kaz
- khm
- kan
- kor
- ckb
- kir
- ltz
- lug
- lin
- lao
- lit
- luo
- lav
- mri
- mkd
- mal
- mon
- mar
- msa
- mlt
- mya
- nob
- npi
- nld
- nso
- nya
- oci
- orm
- ory
- pan
- pol
- pus
- por
- ron
- rus
- bul
- snd
- slk
- slv
- sna
- som
- srp
- swe
- swh
- tam
- tel
- tgk
- tha
- tur
- ukr
- umb
- urd
- uzb
- vie
- wol
- xho
- yor
- yue
- zul
license:
- cc-by-4.0
multilinguality:
- multilingual
size_categories:
- 10K<n<100K
task_categories:
- automatic-speech-recognition
task_ids: []
pretty_name: 'The Cross-lingual TRansfer Evaluation of Multilingual Encoders for Speech
(XTREME-S) benchmark is a benchmark designed to evaluate speech representations
across languages, tasks, domains and data regimes. It covers 102 languages from
10+ language families, 3 different domains and 4 task families: speech recognition,
translation, classification and retrieval.'
tags:
- speech-recognition
---
# FLEURS
## Dataset Description
- **Fine-Tuning script:** [pytorch/speech-recognition](https://github.com/huggingface/transformers/tree/main/examples/pytorch/speech-recognition)
- **Paper:** [FLEURS: Few-shot Learning Evaluation of
Universal Representations of Speech](https://arxiv.org/abs/2205.12446)
- **Total amount of disk used:** ca. 350 GB
Fleurs is the speech version of the [FLoRes machine translation benchmark](https://arxiv.org/abs/2106.03193).
We use 2009 n-way parallel sentences from the FLoRes dev and devtest publicly available sets, in 102 languages.
Training sets have around 10 hours of supervision. Speakers of the train sets are different from the speakers of the dev/test sets. Multilingual fine-tuning is
used, and the "unit error rate" (characters, signs) of all languages is averaged. Languages and results are also grouped into seven geographical areas:
- **Western Europe**: *Asturian, Bosnian, Catalan, Croatian, Danish, Dutch, English, Finnish, French, Galician, German, Greek, Hungarian, Icelandic, Irish, Italian, Kabuverdianu, Luxembourgish, Maltese, Norwegian, Occitan, Portuguese, Spanish, Swedish, Welsh*
- **Eastern Europe**: *Armenian, Belarusian, Bulgarian, Czech, Estonian, Georgian, Latvian, Lithuanian, Macedonian, Polish, Romanian, Russian, Serbian, Slovak, Slovenian, Ukrainian*
- **Central-Asia/Middle-East/North-Africa**: *Arabic, Azerbaijani, Hebrew, Kazakh, Kyrgyz, Mongolian, Pashto, Persian, Sorani-Kurdish, Tajik, Turkish, Uzbek*
- **Sub-Saharan Africa**: *Afrikaans, Amharic, Fula, Ganda, Hausa, Igbo, Kamba, Lingala, Luo, Northern-Sotho, Nyanja, Oromo, Shona, Somali, Swahili, Umbundu, Wolof, Xhosa, Yoruba, Zulu*
- **South-Asia**: *Assamese, Bengali, Gujarati, Hindi, Kannada, Malayalam, Marathi, Nepali, Oriya, Punjabi, Sindhi, Tamil, Telugu, Urdu*
- **South-East Asia**: *Burmese, Cebuano, Filipino, Indonesian, Javanese, Khmer, Lao, Malay, Maori, Thai, Vietnamese*
- **CJK languages**: *Cantonese and Mandarin Chinese, Japanese, Korean*
## How to use & Supported Tasks
### How to use
The `datasets` library allows you to load and pre-process your dataset in pure Python, at scale. The dataset can be downloaded and prepared in one call to your local drive by using the `load_dataset` function.
For example, to download the Hindi config, simply specify the corresponding language config name (i.e., "hi_in" for Hindi):
```python
from datasets import load_dataset
fleurs = load_dataset("google/fleurs", "hi_in", split="train")
```
Using the datasets library, you can also stream the dataset on-the-fly by adding a `streaming=True` argument to the `load_dataset` function call. Loading a dataset in streaming mode loads individual samples of the dataset at a time, rather than downloading the entire dataset to disk.
```python
from datasets import load_dataset
fleurs = load_dataset("google/fleurs", "hi_in", split="train", streaming=True)
print(next(iter(fleurs)))
```
*Bonus*: create a [PyTorch dataloader](https://huggingface.co/docs/datasets/use_with_pytorch) directly with your own datasets (local/streamed).
Local:
```python
from datasets import load_dataset
from torch.utils.data import DataLoader
from torch.utils.data.sampler import BatchSampler, RandomSampler
fleurs = load_dataset("google/fleurs", "hi_in", split="train")
batch_sampler = BatchSampler(RandomSampler(fleurs), batch_size=32, drop_last=False)
dataloader = DataLoader(fleurs, batch_sampler=batch_sampler)
```
Streaming:
```python
from datasets import load_dataset
from torch.utils.data import DataLoader
fleurs = load_dataset("google/fleurs", "hi_in", split="train", streaming=True)
dataloader = DataLoader(fleurs, batch_size=32)
```
To find out more about loading and preparing audio datasets, head over to [hf.co/blog/audio-datasets](https://huggingface.co/blog/audio-datasets).
### Example scripts
Train your own CTC or Seq2Seq Automatic Speech Recognition models on FLEURS with `transformers` - [here](https://github.com/huggingface/transformers/tree/main/examples/pytorch/speech-recognition).
Fine-tune your own Language Identification models on FLEURS with `transformers` - [here](https://github.com/huggingface/transformers/tree/main/examples/pytorch/audio-classification)
### 1. Speech Recognition (ASR)
```py
from datasets import load_dataset
fleurs_asr = load_dataset("google/fleurs", "af_za") # for Afrikaans
# to download all data for multi-lingual fine-tuning uncomment following line
# fleurs_asr = load_dataset("google/fleurs", "all")
# see structure
print(fleurs_asr)
# load audio sample on the fly
audio_input = fleurs_asr["train"][0]["audio"] # first decoded audio sample
transcription = fleurs_asr["train"][0]["transcription"] # first transcription
# use `audio_input` and `transcription` to fine-tune your model for ASR
# for analyses see language groups
all_language_groups = fleurs_asr["train"].features["lang_group_id"].names
lang_group_id = fleurs_asr["train"][0]["lang_group_id"]
all_language_groups[lang_group_id]
```
### 2. Language Identification
LangID can often amount to domain classification, but in FLEURS-LangID the recordings are made in a similar setting across languages and the utterances are n-way parallel sentences in the exact same domain, making this task particularly relevant for evaluating LangID. The setup is simple: FLEURS-LangID is split into train/valid/test for each language, and we create a single train/valid/test for LangID by merging all of them.
```py
from datasets import load_dataset
fleurs_langID = load_dataset("google/fleurs", "all") # to download all data
# see structure
print(fleurs_langID)
# load audio sample on the fly
audio_input = fleurs_langID["train"][0]["audio"] # first decoded audio sample
language_class = fleurs_langID["train"][0]["lang_id"] # first id class
language = fleurs_langID["train"].features["lang_id"].names[language_class]
# use audio_input and language_class to fine-tune your model for audio classification
```
### 3. Retrieval
Retrieval provides n-way parallel speech and text data. Similar to how XTREME for text leverages Tatoeba to evaluate bitext mining, a.k.a. sentence translation retrieval, we use Retrieval to evaluate the quality of fixed-size representations of speech utterances. Our goal is to incentivize the creation of fixed-size speech encoders for speech retrieval. The system has to retrieve the English "key" utterance corresponding to the speech translation of "queries" in 15 languages. Results have to be reported on the Retrieval test sets, whose utterances are used as queries (and keys for English). We augment the English keys with a large number of utterances to make the task more difficult.
```py
from datasets import load_dataset
fleurs_retrieval = load_dataset("google/fleurs", "af_za") # for Afrikaans
# to download all data for multi-lingual fine-tuning uncomment following line
# fleurs_retrieval = load_dataset("google/fleurs", "all")
# see structure
print(fleurs_retrieval)
# load audio sample on the fly
audio_input = fleurs_retrieval["train"][0]["audio"] # decoded audio sample
text_sample_pos = fleurs_retrieval["train"][0]["transcription"] # positive text sample
text_sample_neg = fleurs_retrieval["train"][1:20]["transcription"] # negative text samples
# use `audio_input`, `text_sample_pos`, and `text_sample_neg` to fine-tune your model for retrieval
```
Users can leverage the training (and dev) sets of FLEURS-Retrieval with a ranking loss to build better cross-lingual fixed-size representations of speech.
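As a sketch of such a ranking loss, the contrastive (InfoNCE-style) objective below scores a batch of speech embeddings against their paired text embeddings, treating the matched pair as the positive and all other texts in the batch as negatives. The random embeddings stand in for the outputs of hypothetical speech and text encoders; this is an illustration of the loss family, not the authors' exact training recipe.

```python
import numpy as np

def info_nce_loss(speech_emb, text_emb, temperature=0.07):
    """Contrastive ranking loss: each speech embedding should score
    highest against its own (positive) text embedding, relative to
    all other (negative) text embeddings in the batch."""
    # L2-normalize so dot products are cosine similarities
    s = speech_emb / np.linalg.norm(speech_emb, axis=1, keepdims=True)
    t = text_emb / np.linalg.norm(text_emb, axis=1, keepdims=True)
    logits = s @ t.T / temperature  # (batch, batch) similarity matrix
    # log-softmax over each row; the diagonal holds the positive pairs
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
speech_emb = rng.normal(size=(8, 64))  # stand-ins for encoder outputs
text_emb = rng.normal(size=(8, 64))
loss = info_nce_loss(speech_emb, text_emb)
print(loss)
```

Minimizing this loss pushes matched speech/text pairs together and mismatched pairs apart, which is exactly what retrieval with fixed-size representations needs.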
## Dataset Structure
We show detailed information for the example configuration `af_za` of the dataset.
All other configurations have the same structure.
### Data Instances
**af_za**
- Size of downloaded dataset files: 1.47 GB
- Size of the generated dataset: 1 MB
- Total amount of disk used: 1.47 GB
An example of a data instance of the config `af_za` looks as follows:
```
{'id': 91,
'num_samples': 385920,
'path': '/home/patrick/.cache/huggingface/datasets/downloads/extracted/310a663d52322700b3d3473cbc5af429bd92a23f9bc683594e70bc31232db39e/home/vaxelrod/FLEURS/oss2_obfuscated/af_za/audio/train/17797742076841560615.wav',
'audio': {'path': '/home/patrick/.cache/huggingface/datasets/downloads/extracted/310a663d52322700b3d3473cbc5af429bd92a23f9bc683594e70bc31232db39e/home/vaxelrod/FLEURS/oss2_obfuscated/af_za/audio/train/17797742076841560615.wav',
'array': array([ 0.0000000e+00, 0.0000000e+00, 0.0000000e+00, ...,
-1.1205673e-04, -8.4638596e-05, -1.2731552e-04], dtype=float32),
'sampling_rate': 16000},
'raw_transcription': 'Dit is nog nie huidiglik bekend watter aantygings gemaak sal word of wat owerhede na die seun gelei het nie maar jeugmisdaad-verrigtinge het in die federale hof begin',
'transcription': 'dit is nog nie huidiglik bekend watter aantygings gemaak sal word of wat owerhede na die seun gelei het nie maar jeugmisdaad-verrigtinge het in die federale hof begin',
'gender': 0,
'lang_id': 0,
'language': 'Afrikaans',
'lang_group_id': 3}
```
### Data Fields
The data fields are the same among all splits.
- **id** (int): ID of audio sample
- **num_samples** (int): Number of float values
- **path** (str): Path to the audio file
- **audio** (dict): Audio object including the decoded audio array, sampling rate and path to the audio file
- **raw_transcription** (str): The non-normalized transcription of the audio file
- **transcription** (str): Transcription of the audio file
- **gender** (int): Class id of gender
- **lang_id** (int): Class id of language
- **lang_group_id** (int): Class id of language group
### Data Splits
Every config has a `"train"` split containing *ca.* 1000 examples, and `"validation"` and `"test"` splits each containing *ca.* 400 examples.
## Dataset Creation
We collect between one and three recordings for each sentence (2.3 on average), and build new train-dev-test splits with 1509, 150 and 350 sentences for train, dev and test respectively.
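The split sizes above can be sketched as follows. The single shuffle with a fixed seed is an assumption for illustration only, not the authors' actual procedure (which also balances recordings per sentence):

```python
import random

def make_splits(sentences, sizes=(1509, 150, 350), seed=0):
    """Shuffle once, then carve out disjoint train/dev/test sets of
    the given sizes (the FLEURS split sizes per language)."""
    rng = random.Random(seed)
    shuffled = sentences[:]
    rng.shuffle(shuffled)
    n_train, n_dev, n_test = sizes
    train = shuffled[:n_train]
    dev = shuffled[n_train:n_train + n_dev]
    test = shuffled[n_train + n_dev:n_train + n_dev + n_test]
    return train, dev, test

sentences = [f"sentence-{i}" for i in range(2009)]  # 1509 + 150 + 350
train, dev, test = make_splits(sentences)
print(len(train), len(dev), len(test))
```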
## Considerations for Using the Data
### Social Impact of Dataset
This dataset is meant to encourage the development of speech technology in many more languages of the world. One of the goals is to give everyone equal access to technologies like speech recognition or speech translation, meaning better dubbing and better access to content from the internet (like podcasts, streaming or videos).
### Discussion of Biases
Most datasets have a fair gender distribution across utterances (e.g. the newly introduced FLEURS dataset). While many languages are covered from various regions of the world, the benchmark misses many languages that are all equally important. We believe technology built through FLEURS should generalize to all languages.
### Other Known Limitations
The dataset has a particular focus on read-speech because common evaluation benchmarks like CoVoST-2 or LibriSpeech evaluate on this type of speech. There is sometimes a known mismatch between performance obtained in a read-speech setting and a more noisy setting (in production for instance). Given the big progress that remains to be made on many languages, we believe better performance on FLEURS should still correlate well with actual progress made for speech understanding.
## Additional Information
All datasets are licensed under the [Creative Commons license (CC-BY)](https://creativecommons.org/licenses/).
### Citation Information
You can access the FLEURS paper at https://arxiv.org/abs/2205.12446.
Please cite the paper when referencing the FLEURS corpus as:
```
@article{fleurs2022arxiv,
title = {FLEURS: Few-shot Learning Evaluation of Universal Representations of Speech},
author = {Conneau, Alexis and Ma, Min and Khanuja, Simran and Zhang, Yu and Axelrod, Vera and Dalmia, Siddharth and Riesa, Jason and Rivera, Clara and Bapna, Ankur},
journal={arXiv preprint arXiv:2205.12446},
url = {https://arxiv.org/abs/2205.12446},
  year = {2022}
}
```
### Contributions
Thanks to [@patrickvonplaten](https://github.com/patrickvonplaten) and [@aconneau](https://github.com/aconneau) for adding this dataset.
|
vishruthnath/Calc-Combined-Tagged | ---
dataset_info:
features:
- name: chain
dtype: string
- name: equation
dtype: string
- name: expression
dtype: string
- name: id
dtype: string
- name: num_unique_ops
dtype: int64
- name: operand
sequence: float64
- name: operand_tags
sequence: int64
- name: operation
dtype: string
- name: question
dtype: string
- name: question_split
sequence: string
- name: result
dtype: string
- name: result_float
dtype: float64
- name: valid
dtype: bool
- name: __index_level_0__
dtype: int64
- name: problem_type
dtype: string
- name: grade
dtype: int64
- name: result_unit
dtype: string
- name: source_question
dtype: string
splits:
- name: train
num_bytes: 2589638.721590909
num_examples: 3379
- name: test
num_bytes: 647601.2784090909
num_examples: 845
download_size: 888515
dataset_size: 3237240.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_Herry443__Mistral-7B-KNUT-ref-en | ---
pretty_name: Evaluation run of Herry443/Mistral-7B-KNUT-ref-en
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Herry443/Mistral-7B-KNUT-ref-en](https://huggingface.co/Herry443/Mistral-7B-KNUT-ref-en)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Herry443__Mistral-7B-KNUT-ref-en\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-24T15:03:53.204398](https://huggingface.co/datasets/open-llm-leaderboard/details_Herry443__Mistral-7B-KNUT-ref-en/blob/main/results_2024-03-24T15-03-53.204398.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24854758824381504,\n\
\ \"acc_stderr\": 0.03018986586250285,\n \"acc_norm\": 0.24192493252870081,\n\
\ \"acc_norm_stderr\": 0.030749026105024325,\n \"mc1\": 0.30599755201958384,\n\
\ \"mc1_stderr\": 0.016132229728155048,\n \"mc2\": 0.48926351753239467,\n\
\ \"mc2_stderr\": 0.015211344880077261\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.35665529010238906,\n \"acc_stderr\": 0.013998056902620199,\n\
\ \"acc_norm\": 0.38993174061433444,\n \"acc_norm_stderr\": 0.014252959848892896\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5492929695279825,\n\
\ \"acc_stderr\": 0.004965473894646781,\n \"acc_norm\": 0.7070304720175263,\n\
\ \"acc_norm_stderr\": 0.004541944342035899\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n\
\ \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n\
\ \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n\
\ \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n\
\ \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.30599755201958384,\n \"mc1_stderr\": 0.016132229728155048,\n\
\ \"mc2\": 0.48926351753239467,\n \"mc2_stderr\": 0.015211344880077261\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.6345698500394633,\n\
\ \"acc_stderr\": 0.013533965097638795\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.444275966641395,\n \"acc_stderr\": 0.013686685712261663\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Herry443/Mistral-7B-KNUT-ref-en
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|arc:challenge|25_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|gsm8k|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hellaswag|10_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T15-03-53.204398.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T15-03-53.204398.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- '**/details_harness|winogrande|5_2024-03-24T15-03-53.204398.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-24T15-03-53.204398.parquet'
- config_name: results
data_files:
- split: 2024_03_24T15_03_53.204398
path:
- results_2024-03-24T15-03-53.204398.parquet
- split: latest
path:
- results_2024-03-24T15-03-53.204398.parquet
---
# Dataset Card for Evaluation run of Danielbrdz/Barcenas-3b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Danielbrdz/Barcenas-3b](https://huggingface.co/Danielbrdz/Barcenas-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Danielbrdz__Barcenas-3b",
"harness_winogrande_5",
split="train")
```
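Within each row of the aggregated results, metric names follow a flat `<metric>` / `<metric>_stderr` convention. A minimal offline sketch, reproducing the headline "all" values from the latest run below, that pairs each metric with its standard error for reporting:

```python
# The "all" block from the latest run (copied from the results JSON below).
latest_all = {
    "acc": 0.24854758824381504,
    "acc_stderr": 0.03018986586250285,
    "acc_norm": 0.24192493252870081,
    "acc_norm_stderr": 0.030749026105024325,
    "mc1": 0.30599755201958384,
    "mc1_stderr": 0.016132229728155048,
    "mc2": 0.48926351753239467,
    "mc2_stderr": 0.015211344880077261,
}

# Pair each metric with its "<metric>_stderr" companion.
metrics = {k: (v, latest_all[f"{k}_stderr"])
           for k, v in latest_all.items() if not k.endswith("_stderr")}

for name, (value, stderr) in sorted(metrics.items()):
    print(f"{name}: {value:.4f} ± {stderr:.4f}")
```

The same pairing works for any per-task block (e.g. `harness|hendrycksTest-anatomy|5`), since they all share the flat metric/stderr layout.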
## Latest results
These are the [latest results from run 2024-03-24T15:03:53.204398](https://huggingface.co/datasets/open-llm-leaderboard/details_Danielbrdz__Barcenas-3b/blob/main/results_2024-03-24T15-03-53.204398.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24854758824381504,
"acc_stderr": 0.03018986586250285,
"acc_norm": 0.24192493252870081,
"acc_norm_stderr": 0.030749026105024325,
"mc1": 0.30599755201958384,
"mc1_stderr": 0.016132229728155048,
"mc2": 0.48926351753239467,
"mc2_stderr": 0.015211344880077261
},
"harness|arc:challenge|25": {
"acc": 0.35665529010238906,
"acc_stderr": 0.013998056902620199,
"acc_norm": 0.38993174061433444,
"acc_norm_stderr": 0.014252959848892896
},
"harness|hellaswag|10": {
"acc": 0.5492929695279825,
"acc_stderr": 0.004965473894646781,
"acc_norm": 0.7070304720175263,
"acc_norm_stderr": 0.004541944342035899
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30599755201958384,
"mc1_stderr": 0.016132229728155048,
"mc2": 0.48926351753239467,
"mc2_stderr": 0.015211344880077261
},
"harness|winogrande|5": {
"acc": 0.6345698500394633,
"acc_stderr": 0.013533965097638795
},
"harness|gsm8k|5": {
"acc": 0.444275966641395,
"acc_stderr": 0.013686685712261663
}
}
```
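As a quick illustration of how the per-task scores in the JSON above can be combined, the snippet below takes an unweighted mean over a handful of the reported five-shot accuracies (values copied verbatim from the results; the simple macro-average shown here is an assumption for illustration, not necessarily the leaderboard's exact aggregation):

```python
# Five-shot accuracies copied from the results JSON above.
scores = {
    "jurisprudence": 0.25925925925925924,
    "logical_fallacies": 0.22085889570552147,
    "machine_learning": 0.3125,
    "management": 0.17475728155339806,
    "marketing": 0.2905982905982906,
}

# Unweighted macro-average over these subtasks.
macro_avg = sum(scores.values()) / len(scores)
print(f"macro-average acc over {len(scores)} subtasks: {macro_avg:.4f}")
```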
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
sachith-surge/orca-evaluated-falcon-gpt4-v2 | ---
dataset_info:
features:
- name: original_index
dtype: int64
- name: inputs
dtype: string
- name: targets
dtype: string
- name: task_source
dtype: string
- name: task_name
dtype: string
- name: template_type
dtype: string
- name: system_message
dtype: string
- name: explained_targets
dtype: string
- name: dataset_source
dtype: string
- name: falcon_status
dtype: string
- name: falcon_rating
dtype: string
- name: falcon_reason
dtype: string
- name: gpt4_status
dtype: string
- name: gpt4_rating
dtype: string
- name: gpt4_reason
dtype: string
splits:
- name: train
num_bytes: 6521750
num_examples: 3517
download_size: 3081179
dataset_size: 6521750
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "orca-evaluated-falcon-gpt4-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
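The schema above includes separate Falcon and GPT-4 rating fields for each example. Assuming the ratings are numeric values encoded as strings, a minimal sketch of filtering for rows both evaluators scored highly (the rows and the threshold of 4 below are invented for illustration; only the field names come from the YAML above):

```python
# Hypothetical rows mirroring this dataset's schema (field names from the
# YAML above; the task names and rating values are invented).
rows = [
    {"task_name": "cot_gsm8k", "falcon_rating": "4", "gpt4_rating": "5"},
    {"task_name": "flan_zsopt", "falcon_rating": "2", "gpt4_rating": "3"},
]

# Keep only examples rated 4 or higher by both evaluators.
high_quality = [
    r for r in rows
    if int(r["falcon_rating"]) >= 4 and int(r["gpt4_rating"]) >= 4
]
print([r["task_name"] for r in high_quality])
```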
newsgroup | ---
annotations_creators:
- found
language:
- en
language_creators:
- found
license:
- unknown
multilinguality:
- monolingual
pretty_name: 20 Newsgroups
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- multi-class-classification
paperswithcode_id: 20-newsgroups
dataset_info:
- config_name: 18828_alt.atheism
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1669511
num_examples: 799
download_size: 14666916
dataset_size: 1669511
- config_name: 18828_comp.graphics
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1661199
num_examples: 973
download_size: 14666916
dataset_size: 1661199
- config_name: 18828_comp.os.ms-windows.misc
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2378739
num_examples: 985
download_size: 14666916
dataset_size: 2378739
- config_name: 18828_comp.sys.ibm.pc.hardware
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1185187
num_examples: 982
download_size: 14666916
dataset_size: 1185187
- config_name: 18828_comp.sys.mac.hardware
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1056264
num_examples: 961
download_size: 14666916
dataset_size: 1056264
- config_name: 18828_comp.windows.x
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1876297
num_examples: 980
download_size: 14666916
dataset_size: 1876297
- config_name: 18828_misc.forsale
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 925124
num_examples: 972
download_size: 14666916
dataset_size: 925124
- config_name: 18828_rec.autos
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1295307
num_examples: 990
download_size: 14666916
dataset_size: 1295307
- config_name: 18828_rec.motorcycles
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1206491
num_examples: 994
download_size: 14666916
dataset_size: 1206491
- config_name: 18828_rec.sport.baseball
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1369551
num_examples: 994
download_size: 14666916
dataset_size: 1369551
- config_name: 18828_rec.sport.hockey
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1758094
num_examples: 999
download_size: 14666916
dataset_size: 1758094
- config_name: 18828_sci.crypt
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2050727
num_examples: 991
download_size: 14666916
dataset_size: 2050727
- config_name: 18828_sci.electronics
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1237175
num_examples: 981
download_size: 14666916
dataset_size: 1237175
- config_name: 18828_sci.med
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1886363
num_examples: 990
download_size: 14666916
dataset_size: 1886363
- config_name: 18828_sci.space
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1812803
num_examples: 987
download_size: 14666916
dataset_size: 1812803
- config_name: 18828_soc.religion.christian
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2307486
num_examples: 997
download_size: 14666916
dataset_size: 2307486
- config_name: 18828_talk.politics.guns
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1922992
num_examples: 910
download_size: 14666916
dataset_size: 1922992
- config_name: 18828_talk.politics.mideast
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2910324
num_examples: 940
download_size: 14666916
dataset_size: 2910324
- config_name: 18828_talk.politics.misc
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2102809
num_examples: 775
download_size: 14666916
dataset_size: 2102809
- config_name: 18828_talk.religion.misc
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1374261
num_examples: 628
download_size: 14666916
dataset_size: 1374261
- config_name: 19997_alt.atheism
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2562277
num_examples: 1000
download_size: 17332201
dataset_size: 2562277
- config_name: 19997_comp.graphics
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2181673
num_examples: 1000
download_size: 17332201
dataset_size: 2181673
- config_name: 19997_comp.os.ms-windows.misc
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2898760
num_examples: 1000
download_size: 17332201
dataset_size: 2898760
- config_name: 19997_comp.sys.ibm.pc.hardware
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1671166
num_examples: 1000
download_size: 17332201
dataset_size: 1671166
- config_name: 19997_comp.sys.mac.hardware
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1580881
num_examples: 1000
download_size: 17332201
dataset_size: 1580881
- config_name: 19997_comp.windows.x
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2418273
num_examples: 1000
download_size: 17332201
dataset_size: 2418273
- config_name: 19997_misc.forsale
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1412012
num_examples: 1000
download_size: 17332201
dataset_size: 1412012
- config_name: 19997_rec.autos
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1780502
num_examples: 1000
download_size: 17332201
dataset_size: 1780502
- config_name: 19997_rec.motorcycles
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1677964
num_examples: 1000
download_size: 17332201
dataset_size: 1677964
- config_name: 19997_rec.sport.baseball
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1835432
num_examples: 1000
download_size: 17332201
dataset_size: 1835432
- config_name: 19997_rec.sport.hockey
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2207282
num_examples: 1000
download_size: 17332201
dataset_size: 2207282
- config_name: 19997_sci.crypt
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2607835
num_examples: 1000
download_size: 17332201
dataset_size: 2607835
- config_name: 19997_sci.electronics
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1732199
num_examples: 1000
download_size: 17332201
dataset_size: 1732199
- config_name: 19997_sci.med
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2388789
num_examples: 1000
download_size: 17332201
dataset_size: 2388789
- config_name: 19997_sci.space
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2351411
num_examples: 1000
download_size: 17332201
dataset_size: 2351411
- config_name: 19997_soc.religion.christian
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2743018
num_examples: 997
download_size: 17332201
dataset_size: 2743018
- config_name: 19997_talk.politics.guns
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2639343
num_examples: 1000
download_size: 17332201
dataset_size: 2639343
- config_name: 19997_talk.politics.mideast
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3695931
num_examples: 1000
download_size: 17332201
dataset_size: 3695931
- config_name: 19997_talk.politics.misc
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3169183
num_examples: 1000
download_size: 17332201
dataset_size: 3169183
- config_name: 19997_talk.religion.misc
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 2658700
num_examples: 1000
download_size: 17332201
dataset_size: 2658700
- config_name: bydate_alt.atheism
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1042224
num_examples: 480
- name: test
num_bytes: 702920
num_examples: 319
download_size: 14464277
dataset_size: 1745144
- config_name: bydate_comp.graphics
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 911665
num_examples: 584
- name: test
num_bytes: 849632
num_examples: 389
download_size: 14464277
dataset_size: 1761297
- config_name: bydate_comp.os.ms-windows.misc
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1770988
num_examples: 591
- name: test
num_bytes: 706676
num_examples: 394
download_size: 14464277
dataset_size: 2477664
- config_name: bydate_comp.sys.ibm.pc.hardware
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 800446
num_examples: 590
- name: test
num_bytes: 485310
num_examples: 392
download_size: 14464277
dataset_size: 1285756
- config_name: bydate_comp.sys.mac.hardware
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 696311
num_examples: 578
- name: test
num_bytes: 468791
num_examples: 385
download_size: 14464277
dataset_size: 1165102
- config_name: bydate_comp.windows.x
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1243463
num_examples: 593
- name: test
num_bytes: 795366
num_examples: 395
download_size: 14464277
dataset_size: 2038829
- config_name: bydate_misc.forsale
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 611210
num_examples: 585
- name: test
num_bytes: 415902
num_examples: 390
download_size: 14464277
dataset_size: 1027112
- config_name: bydate_rec.autos
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 860646
num_examples: 594
- name: test
num_bytes: 535378
num_examples: 396
download_size: 14464277
dataset_size: 1396024
- config_name: bydate_rec.motorcycles
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 811151
num_examples: 598
- name: test
num_bytes: 497735
num_examples: 398
download_size: 14464277
dataset_size: 1308886
- config_name: bydate_rec.sport.baseball
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 850740
num_examples: 597
- name: test
num_bytes: 618609
num_examples: 397
download_size: 14464277
dataset_size: 1469349
- config_name: bydate_rec.sport.hockey
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1189652
num_examples: 600
- name: test
num_bytes: 666358
num_examples: 399
download_size: 14464277
dataset_size: 1856010
- config_name: bydate_sci.crypt
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1502448
num_examples: 595
- name: test
num_bytes: 657727
num_examples: 396
download_size: 14464277
dataset_size: 2160175
- config_name: bydate_sci.electronics
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 814856
num_examples: 591
- name: test
num_bytes: 523095
num_examples: 393
download_size: 14464277
dataset_size: 1337951
- config_name: bydate_sci.med
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1195201
num_examples: 594
- name: test
num_bytes: 791826
num_examples: 396
download_size: 14464277
dataset_size: 1987027
- config_name: bydate_sci.space
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1197965
num_examples: 593
- name: test
num_bytes: 721771
num_examples: 394
download_size: 14464277
dataset_size: 1919736
- config_name: bydate_soc.religion.christian
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1358047
num_examples: 599
- name: test
num_bytes: 1003668
num_examples: 398
download_size: 14464277
dataset_size: 2361715
- config_name: bydate_talk.politics.guns
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1313019
num_examples: 546
- name: test
num_bytes: 701477
num_examples: 364
download_size: 14464277
dataset_size: 2014496
- config_name: bydate_talk.politics.mideast
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1765833
num_examples: 564
- name: test
num_bytes: 1236435
num_examples: 376
download_size: 14464277
dataset_size: 3002268
- config_name: bydate_talk.politics.misc
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1328057
num_examples: 465
- name: test
num_bytes: 853395
num_examples: 310
download_size: 14464277
dataset_size: 2181452
- config_name: bydate_talk.religion.misc
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 835761
num_examples: 377
- name: test
num_bytes: 598452
num_examples: 251
download_size: 14464277
dataset_size: 1434213
---
# Dataset Card for "newsgroup"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [http://qwone.com/~jason/20Newsgroups/](http://qwone.com/~jason/20Newsgroups/)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [NewsWeeder: Learning to Filter Netnews](https://doi.org/10.1016/B978-1-55860-377-6.50048-7)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 929.27 MB
- **Size of the generated dataset:** 124.41 MB
- **Total amount of disk used:** 1.05 GB
### Dataset Summary
The 20 Newsgroups data set is a collection of approximately 20,000 newsgroup documents, partitioned (nearly) evenly across
20 different newsgroups. To the best of my knowledge, it was originally collected by Ken Lang, probably for his Newsweeder:
Learning to filter netnews paper, though he does not explicitly mention this collection. The 20 newsgroups collection has become
a popular data set for experiments in text applications of machine learning techniques, such as text classification and text clustering.
The 18828 version does not include cross-posts and includes only the "From" and "Subject" headers.
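Since each example keeps only the "From" and "Subject" headers ahead of the body, the standard library's `email` parser is enough to split a post into its parts. A minimal sketch (the message text below is invented, in the style of the 18828 version):

```python
from email.parser import Parser

# A made-up post in the style of the 18828 version, which keeps
# only the "From" and "Subject" headers ahead of the body.
raw = """From: example@cs.cmu.edu (Example Poster)
Subject: Re: question about text classification

Body of the post goes here.
"""

msg = Parser().parsestr(raw)
print(msg["From"])                # sender line
print(msg["Subject"])             # subject line
print(msg.get_payload().strip())  # message body
```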
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### 18828_alt.atheism
- **Size of downloaded dataset files:** 14.67 MB
- **Size of the generated dataset:** 1.67 MB
- **Total amount of disk used:** 16.34 MB
An example of 'train' looks as follows.
```
```
#### 18828_comp.graphics
- **Size of downloaded dataset files:** 14.67 MB
- **Size of the generated dataset:** 1.66 MB
- **Total amount of disk used:** 16.33 MB
An example of 'train' looks as follows.
```
```
#### 18828_comp.os.ms-windows.misc
- **Size of downloaded dataset files:** 14.67 MB
- **Size of the generated dataset:** 2.38 MB
- **Total amount of disk used:** 17.05 MB
An example of 'train' looks as follows.
```
```
#### 18828_comp.sys.ibm.pc.hardware
- **Size of downloaded dataset files:** 14.67 MB
- **Size of the generated dataset:** 1.18 MB
- **Total amount of disk used:** 15.85 MB
An example of 'train' looks as follows.
```
```
#### 18828_comp.sys.mac.hardware
- **Size of downloaded dataset files:** 14.67 MB
- **Size of the generated dataset:** 1.06 MB
- **Total amount of disk used:** 15.73 MB
An example of 'train' looks as follows.
```
```
### Data Fields
The data fields are the same among all splits.
#### 18828_alt.atheism
- `text`: a `string` feature.
#### 18828_comp.graphics
- `text`: a `string` feature.
#### 18828_comp.os.ms-windows.misc
- `text`: a `string` feature.
#### 18828_comp.sys.ibm.pc.hardware
- `text`: a `string` feature.
#### 18828_comp.sys.mac.hardware
- `text`: a `string` feature.
### Data Splits
| name |train|
|------------------------------|----:|
|18828_alt.atheism | 799|
|18828_comp.graphics | 973|
|18828_comp.os.ms-windows.misc | 985|
|18828_comp.sys.ibm.pc.hardware| 982|
|18828_comp.sys.mac.hardware | 961|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@incollection{LANG1995331,
title = {NewsWeeder: Learning to Filter Netnews},
editor = {Armand Prieditis and Stuart Russell},
booktitle = {Machine Learning Proceedings 1995},
publisher = {Morgan Kaufmann},
address = {San Francisco (CA)},
pages = {331-339},
year = {1995},
isbn = {978-1-55860-377-6},
doi = {https://doi.org/10.1016/B978-1-55860-377-6.50048-7},
url = {https://www.sciencedirect.com/science/article/pii/B9781558603776500487},
author = {Ken Lang},
}
```
### Contributions
Thanks to [@mariamabarham](https://github.com/mariamabarham), [@thomwolf](https://github.com/thomwolf), [@lhoestq](https://github.com/lhoestq) for adding this dataset. |
open-llm-leaderboard/details_maywell__PiVoT-10.7B-Mistral-v0.2 | ---
pretty_name: Evaluation run of maywell/PiVoT-10.7B-Mistral-v0.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [maywell/PiVoT-10.7B-Mistral-v0.2](https://huggingface.co/maywell/PiVoT-10.7B-Mistral-v0.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_maywell__PiVoT-10.7B-Mistral-v0.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-16T19:05:37.712893](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__PiVoT-10.7B-Mistral-v0.2/blob/main/results_2023-12-16T19-05-37.712893.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5992040625455914,\n\
\ \"acc_stderr\": 0.03324031031237355,\n \"acc_norm\": 0.6028778357395081,\n\
\ \"acc_norm_stderr\": 0.033924366555740444,\n \"mc1\": 0.4186046511627907,\n\
\ \"mc1_stderr\": 0.01727001528447686,\n \"mc2\": 0.5823109285763256,\n\
\ \"mc2_stderr\": 0.01521353248750615\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.591296928327645,\n \"acc_stderr\": 0.014365750345427,\n\
\ \"acc_norm\": 0.6331058020477816,\n \"acc_norm_stderr\": 0.0140841331181043\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6161123282214698,\n\
\ \"acc_stderr\": 0.0048533716462392466,\n \"acc_norm\": 0.8167695678151763,\n\
\ \"acc_norm_stderr\": 0.0038606469988972836\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.03878139888797611,\n\
\ \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.03878139888797611\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.02906722014664483,\n\
\ \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.02906722014664483\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n\
\ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n\
\ \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n\
\ \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.5895953757225434,\n\
\ \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287533,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287533\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.03267862331014063,\n\
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.03267862331014063\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4312169312169312,\n \"acc_stderr\": 0.025506481698138204,\n \"\
acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.025506481698138204\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6516129032258065,\n\
\ \"acc_stderr\": 0.02710482632810094,\n \"acc_norm\": 0.6516129032258065,\n\
\ \"acc_norm_stderr\": 0.02710482632810094\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709443,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.024639789097709443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5846153846153846,\n \"acc_stderr\": 0.02498535492310234,\n \
\ \"acc_norm\": 0.5846153846153846,\n \"acc_norm_stderr\": 0.02498535492310234\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683512,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683512\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.029837962388291936,\n\
\ \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.029837962388291936\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8055045871559633,\n \"acc_stderr\": 0.01697028909045804,\n \"\
acc_norm\": 0.8055045871559633,\n \"acc_norm_stderr\": 0.01697028909045804\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.0286265479124374,\n \"acc_norm\"\
: 0.7892156862745098,\n \"acc_norm_stderr\": 0.0286265479124374\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \"\
acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.038968789850704164,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.038968789850704164\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724146,\n\
\ \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724146\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.02158649400128137,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.02158649400128137\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7828863346104725,\n\
\ \"acc_stderr\": 0.014743125394823302,\n \"acc_norm\": 0.7828863346104725,\n\
\ \"acc_norm_stderr\": 0.014743125394823302\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.02607431485165708,\n\
\ \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.02607431485165708\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27150837988826815,\n\
\ \"acc_stderr\": 0.014874252168095275,\n \"acc_norm\": 0.27150837988826815,\n\
\ \"acc_norm_stderr\": 0.014874252168095275\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6699346405228758,\n \"acc_stderr\": 0.026925654653615697,\n\
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.026925654653615697\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n\
\ \"acc_stderr\": 0.02666441088693762,\n \"acc_norm\": 0.6720257234726688,\n\
\ \"acc_norm_stderr\": 0.02666441088693762\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6604938271604939,\n \"acc_stderr\": 0.026348564412011624,\n\
\ \"acc_norm\": 0.6604938271604939,\n \"acc_norm_stderr\": 0.026348564412011624\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666907,\n \
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666907\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44132985658409385,\n\
\ \"acc_stderr\": 0.01268201633564667,\n \"acc_norm\": 0.44132985658409385,\n\
\ \"acc_norm_stderr\": 0.01268201633564667\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5845588235294118,\n \"acc_stderr\": 0.029935342707877746,\n\
\ \"acc_norm\": 0.5845588235294118,\n \"acc_norm_stderr\": 0.029935342707877746\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6274509803921569,\n \"acc_stderr\": 0.019559646809215923,\n \
\ \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.019559646809215923\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5673469387755102,\n \"acc_stderr\": 0.031717528240626645,\n\
\ \"acc_norm\": 0.5673469387755102,\n \"acc_norm_stderr\": 0.031717528240626645\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7860696517412935,\n\
\ \"acc_stderr\": 0.028996909693328913,\n \"acc_norm\": 0.7860696517412935,\n\
\ \"acc_norm_stderr\": 0.028996909693328913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7543859649122807,\n \"acc_stderr\": 0.0330140594698725,\n\
\ \"acc_norm\": 0.7543859649122807,\n \"acc_norm_stderr\": 0.0330140594698725\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4186046511627907,\n\
\ \"mc1_stderr\": 0.01727001528447686,\n \"mc2\": 0.5823109285763256,\n\
\ \"mc2_stderr\": 0.01521353248750615\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8003157063930545,\n \"acc_stderr\": 0.011235328382625849\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.42380591357088704,\n \
\ \"acc_stderr\": 0.01361163200881036\n }\n}\n```"
repo_url: https://huggingface.co/maywell/PiVoT-10.7B-Mistral-v0.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|arc:challenge|25_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|gsm8k|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hellaswag|10_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T19-05-37.712893.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T19-05-37.712893.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- '**/details_harness|winogrande|5_2023-12-16T19-05-37.712893.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-16T19-05-37.712893.parquet'
- config_name: results
data_files:
- split: 2023_12_16T19_05_37.712893
path:
- results_2023-12-16T19-05-37.712893.parquet
- split: latest
path:
- results_2023-12-16T19-05-37.712893.parquet
---
# Dataset Card for Evaluation run of maywell/PiVoT-10.7B-Mistral-v0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [maywell/PiVoT-10.7B-Mistral-v0.2](https://huggingface.co/maywell/PiVoT-10.7B-Mistral-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_maywell__PiVoT-10.7B-Mistral-v0.2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-16T19:05:37.712893](https://huggingface.co/datasets/open-llm-leaderboard/details_maywell__PiVoT-10.7B-Mistral-v0.2/blob/main/results_2023-12-16T19-05-37.712893.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5992040625455914,
"acc_stderr": 0.03324031031237355,
"acc_norm": 0.6028778357395081,
"acc_norm_stderr": 0.033924366555740444,
"mc1": 0.4186046511627907,
"mc1_stderr": 0.01727001528447686,
"mc2": 0.5823109285763256,
"mc2_stderr": 0.01521353248750615
},
"harness|arc:challenge|25": {
"acc": 0.591296928327645,
"acc_stderr": 0.014365750345427,
"acc_norm": 0.6331058020477816,
"acc_norm_stderr": 0.0140841331181043
},
"harness|hellaswag|10": {
"acc": 0.6161123282214698,
"acc_stderr": 0.0048533716462392466,
"acc_norm": 0.8167695678151763,
"acc_norm_stderr": 0.0038606469988972836
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.03878139888797611,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.03878139888797611
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6641509433962264,
"acc_stderr": 0.02906722014664483,
"acc_norm": 0.6641509433962264,
"acc_norm_stderr": 0.02906722014664483
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287533,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287533
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.025506481698138204,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.025506481698138204
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6516129032258065,
"acc_stderr": 0.02710482632810094,
"acc_norm": 0.6516129032258065,
"acc_norm_stderr": 0.02710482632810094
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.024639789097709443,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.024639789097709443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5846153846153846,
"acc_stderr": 0.02498535492310234,
"acc_norm": 0.5846153846153846,
"acc_norm_stderr": 0.02498535492310234
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683512,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683512
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.029837962388291936,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.029837962388291936
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8055045871559633,
"acc_stderr": 0.01697028909045804,
"acc_norm": 0.8055045871559633,
"acc_norm_stderr": 0.01697028909045804
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.0286265479124374,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.0286265479124374
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.03114679648297246,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.03114679648297246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.038968789850704164,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.038968789850704164
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724146,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724146
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128137,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7828863346104725,
"acc_stderr": 0.014743125394823302,
"acc_norm": 0.7828863346104725,
"acc_norm_stderr": 0.014743125394823302
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.02607431485165708,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.02607431485165708
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27150837988826815,
"acc_stderr": 0.014874252168095275,
"acc_norm": 0.27150837988826815,
"acc_norm_stderr": 0.014874252168095275
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.026925654653615697,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.026925654653615697
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.02666441088693762,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.02666441088693762
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6604938271604939,
"acc_stderr": 0.026348564412011624,
"acc_norm": 0.6604938271604939,
"acc_norm_stderr": 0.026348564412011624
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.029658235097666907,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.029658235097666907
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44132985658409385,
"acc_stderr": 0.01268201633564667,
"acc_norm": 0.44132985658409385,
"acc_norm_stderr": 0.01268201633564667
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5845588235294118,
"acc_stderr": 0.029935342707877746,
"acc_norm": 0.5845588235294118,
"acc_norm_stderr": 0.029935342707877746
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.019559646809215923,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.019559646809215923
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5673469387755102,
"acc_stderr": 0.031717528240626645,
"acc_norm": 0.5673469387755102,
"acc_norm_stderr": 0.031717528240626645
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7860696517412935,
"acc_stderr": 0.028996909693328913,
"acc_norm": 0.7860696517412935,
"acc_norm_stderr": 0.028996909693328913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7543859649122807,
"acc_stderr": 0.0330140594698725,
"acc_norm": 0.7543859649122807,
"acc_norm_stderr": 0.0330140594698725
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4186046511627907,
"mc1_stderr": 0.01727001528447686,
"mc2": 0.5823109285763256,
"mc2_stderr": 0.01521353248750615
},
"harness|winogrande|5": {
"acc": 0.8003157063930545,
"acc_stderr": 0.011235328382625849
},
"harness|gsm8k|5": {
"acc": 0.42380591357088704,
"acc_stderr": 0.01361163200881036
}
}
```
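The per-task entries above can be combined into a headline average with a few lines of Python. This is a minimal sketch over a small hand-copied excerpt of the dictionary, not the full results file, and it uses an unweighted mean; the leaderboard's own aggregation may differ.

```python
# Excerpt of the per-task results shown above; each hendrycksTest task
# reports "acc" and "acc_norm" plus their standard errors.
results = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5925925925925926},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6513157894736842},
    "harness|hendrycksTest-business_ethics|5": {"acc": 0.57},
}

# Unweighted mean accuracy across the selected tasks.
mean_acc = sum(v["acc"] for v in results.values()) / len(results)
print(round(mean_acc, 4))
```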
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
nunonmg/wmt22_w_shots_from_gptmt | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: input
dtype: string
- name: answer
dtype: string
- name: examples
list:
- name: source
dtype: string
- name: target
dtype: string
- name: lp
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 26661051
num_examples: 11986
download_size: 14342583
dataset_size: 26661051
---
# Dataset Card for "wmt22_w_shots_from_gptmt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
e-mohammadii/aaaa | ---
license: creativeml-openrail-m
task_categories:
- question-answering
language:
- ae
tags:
- legal
size_categories:
- 1M<n<10M
--- |
open-llm-leaderboard/details_cookinai__DonutLM-v1 | ---
pretty_name: Evaluation run of cookinai/DonutLM-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cookinai/DonutLM-v1](https://huggingface.co/cookinai/DonutLM-v1) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cookinai__DonutLM-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-23T17:20:03.494171](https://huggingface.co/datasets/open-llm-leaderboard/details_cookinai__DonutLM-v1/blob/main/results_2023-12-23T17-20-03.494171.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6576690631133046,\n\
\ \"acc_stderr\": 0.03192024452939422,\n \"acc_norm\": 0.6585907567051082,\n\
\ \"acc_norm_stderr\": 0.032571444037302465,\n \"mc1\": 0.4602203182374541,\n\
\ \"mc1_stderr\": 0.01744801722396088,\n \"mc2\": 0.6336336766166446,\n\
\ \"mc2_stderr\": 0.015095668911066656\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.659556313993174,\n \"acc_stderr\": 0.013847460518892978,\n\
\ \"acc_norm\": 0.6911262798634812,\n \"acc_norm_stderr\": 0.013501770929344\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6667994423421629,\n\
\ \"acc_stderr\": 0.004703942346762255,\n \"acc_norm\": 0.8590918143796057,\n\
\ \"acc_norm_stderr\": 0.003472157511639361\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700914,\n\
\ \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700914\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.0356760379963917,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.0356760379963917\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6127659574468085,\n \"acc_stderr\": 0.03184389265339525,\n\
\ \"acc_norm\": 0.6127659574468085,\n \"acc_norm_stderr\": 0.03184389265339525\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083522,\n \"\
acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083522\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"\
acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768766,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768766\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \
\ \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394848,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394848\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.02971914287634285,\n \
\ \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.02971914287634285\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8587155963302753,\n \"acc_stderr\": 0.014933868987028072,\n \"\
acc_norm\": 0.8587155963302753,\n \"acc_norm_stderr\": 0.014933868987028072\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"\
acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8227848101265823,\n \"acc_stderr\": 0.024856364184503224,\n \
\ \"acc_norm\": 0.8227848101265823,\n \"acc_norm_stderr\": 0.024856364184503224\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.0306365913486998,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.0306365913486998\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794087,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794087\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037182,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037182\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.031570650789119,\n\
\ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.031570650789119\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8378033205619413,\n\
\ \"acc_stderr\": 0.013182222616720885,\n \"acc_norm\": 0.8378033205619413,\n\
\ \"acc_norm_stderr\": 0.013182222616720885\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4301675977653631,\n\
\ \"acc_stderr\": 0.016558601636041035,\n \"acc_norm\": 0.4301675977653631,\n\
\ \"acc_norm_stderr\": 0.016558601636041035\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.02495418432487991,\n\
\ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.02495418432487991\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818767,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818767\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"\
acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n\
\ \"acc_stderr\": 0.012744149704869647,\n \"acc_norm\": 0.4680573663624511,\n\
\ \"acc_norm_stderr\": 0.012744149704869647\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6879084967320261,\n \"acc_stderr\": 0.01874501120127766,\n \
\ \"acc_norm\": 0.6879084967320261,\n \"acc_norm_stderr\": 0.01874501120127766\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n\
\ \"acc_stderr\": 0.02372983088101853,\n \"acc_norm\": 0.8706467661691543,\n\
\ \"acc_norm_stderr\": 0.02372983088101853\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4602203182374541,\n\
\ \"mc1_stderr\": 0.01744801722396088,\n \"mc2\": 0.6336336766166446,\n\
\ \"mc2_stderr\": 0.015095668911066656\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8168902920284136,\n \"acc_stderr\": 0.010869778633168367\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6679302501895376,\n \
\ \"acc_stderr\": 0.012972465034361863\n }\n}\n```"
repo_url: https://huggingface.co/Danielbrdz/Barcenas-3b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|arc:challenge|25_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|gsm8k|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hellaswag|10_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T17-20-03.494171.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-23T17-20-03.494171.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- '**/details_harness|winogrande|5_2023-12-23T17-20-03.494171.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-23T17-20-03.494171.parquet'
- config_name: results
data_files:
- split: 2023_12_23T17_20_03.494171
path:
- results_2023-12-23T17-20-03.494171.parquet
- split: latest
path:
- results_2023-12-23T17-20-03.494171.parquet
---
# Dataset Card for Evaluation run of cookinai/DonutLM-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cookinai/DonutLM-v1](https://huggingface.co/cookinai/DonutLM-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cookinai__DonutLM-v1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-23T17:20:03.494171](https://huggingface.co/datasets/open-llm-leaderboard/details_cookinai__DonutLM-v1/blob/main/results_2023-12-23T17-20-03.494171.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6576690631133046,
"acc_stderr": 0.03192024452939422,
"acc_norm": 0.6585907567051082,
"acc_norm_stderr": 0.032571444037302465,
"mc1": 0.4602203182374541,
"mc1_stderr": 0.01744801722396088,
"mc2": 0.6336336766166446,
"mc2_stderr": 0.015095668911066656
},
"harness|arc:challenge|25": {
"acc": 0.659556313993174,
"acc_stderr": 0.013847460518892978,
"acc_norm": 0.6911262798634812,
"acc_norm_stderr": 0.013501770929344
},
"harness|hellaswag|10": {
"acc": 0.6667994423421629,
"acc_stderr": 0.004703942346762255,
"acc_norm": 0.8590918143796057,
"acc_norm_stderr": 0.003472157511639361
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700914,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700914
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.0356760379963917,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.0356760379963917
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105653,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105653
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6127659574468085,
"acc_stderr": 0.03184389265339525,
"acc_norm": 0.6127659574468085,
"acc_norm_stderr": 0.03184389265339525
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083522,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083522
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768766,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768766
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394848,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394848
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.02971914287634285,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.02971914287634285
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8587155963302753,
"acc_stderr": 0.014933868987028072,
"acc_norm": 0.8587155963302753,
"acc_norm_stderr": 0.014933868987028072
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8227848101265823,
"acc_stderr": 0.024856364184503224,
"acc_norm": 0.8227848101265823,
"acc_norm_stderr": 0.024856364184503224
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.0306365913486998,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.0306365913486998
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794087,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794087
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037182,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037182
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.031570650789119,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.031570650789119
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8378033205619413,
"acc_stderr": 0.013182222616720885,
"acc_norm": 0.8378033205619413,
"acc_norm_stderr": 0.013182222616720885
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4301675977653631,
"acc_stderr": 0.016558601636041035,
"acc_norm": 0.4301675977653631,
"acc_norm_stderr": 0.016558601636041035
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818767,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818767
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869647,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869647
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6879084967320261,
"acc_stderr": 0.01874501120127766,
"acc_norm": 0.6879084967320261,
"acc_norm_stderr": 0.01874501120127766
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.02372983088101853,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.02372983088101853
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4602203182374541,
"mc1_stderr": 0.01744801722396088,
"mc2": 0.6336336766166446,
"mc2_stderr": 0.015095668911066656
},
"harness|winogrande|5": {
"acc": 0.8168902920284136,
"acc_stderr": 0.010869778633168367
},
"harness|gsm8k|5": {
"acc": 0.6679302501895376,
"acc_stderr": 0.012972465034361863
}
}
```
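As a quick sanity check, per-task accuracies like those above can be averaged programmatically. The snippet below is only a sketch: it hand-copies three of the `hendrycksTest` entries from the results, so its mean is illustrative rather than the full-benchmark average.

```python
# Sketch: averaging per-task MMLU accuracies. Only three of the
# hendrycksTest entries above are reproduced here for brevity,
# so this mean is illustrative, not the full-benchmark score.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.31},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6370370370370371},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6907894736842105},
}
mmlu = {k: v for k, v in results.items() if "hendrycksTest" in k}
avg_acc = sum(v["acc"] for v in mmlu.values()) / len(mmlu)
print(f"{avg_acc:.4f}")  # prints 0.5459
```

The same pattern applies to the full results JSON once downloaded: filter keys on the `hendrycksTest` prefix and average.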
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
UCLA-AGI/SPIN_iter1 | ---
license: apache-2.0
dataset_info:
features:
- name: real
list:
- name: content
dtype: string
- name: role
dtype: string
- name: generated
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 214923816
num_examples: 49792
- name: test
num_bytes: 2150878
num_examples: 500
download_size: 121081852
dataset_size: 217074694
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
khoomeik/gzipscale-code-C-256M | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 1028119248
num_examples: 1000116
download_size: 299522703
dataset_size: 1028119248
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bartoszmaj/process | ---
license: openrail
dataset_info:
features:
- name: nouns
dtype: string
splits:
- name: train
num_bytes: 1726847115
num_examples: 4600698
download_size: 998611496
dataset_size: 1726847115
---
|
TheMrguiller/BilbaoCaptions | ---
dataset_info:
features:
- name: caption
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 1372144989.6
num_examples: 3960
- name: test
num_bytes: 343036247.4
num_examples: 990
download_size: 1709055735
dataset_size: 1715181237
language:
- en
tags:
- code
size_categories:
- 1K<n<10K
---
# Dataset Card for "BilbaoCaptions"
## Dataset Description
- **Homepage:** https://github.com/TheMrguiller/MUCSI_Modal
- **Repository:** https://github.com/TheMrguiller/MUCSI_Modal
- **Paper:** It is a follow up of the Flamingo model paper
- **Leaderboard:**
- **Point of Contact:** https://github.com/TheMrguiller/MUCSI_Modal
### Dataset Summary
This dataset was collected for a project for a master's degree in Computation and Intelligent Systems at the University of Deusto. It was created by students, with data collected from well-known webpages in the Basque Country: Deia and Getimages.
### Supported Tasks and Leaderboards
The dataset is prepared to be used for visual question answering.
### Languages
The dataset is in English.
## Dataset Structure
### Data Fields
- `Caption`: This field contains the description of the image.
- `Image`: This field contains the image corresponding to the description.
### Data Splits
The dataset is split into 80% train and 20% test.
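For reference, the 80/20 proportion can be verified against the example counts listed in this card's metadata (3960 train examples and 990 test examples):

```python
# Check the documented 80/20 split using the example counts
# from the card metadata (3960 train / 990 test).
train_n, test_n = 3960, 990
total = train_n + test_n
print(f"train: {train_n / total:.0%}, test: {test_n / total:.0%}")
# prints: train: 80%, test: 20%
```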
## Considerations for Using the Data
The dataset has some flaws regarding the descriptions. The descriptions are sometimes too specific for a captioning task, and some are too generic. There is also a lot of football match data, so the dataset isn't well balanced.
## Additional Information
### Dataset Curators
The curators of this dataset were the students of the Master's degree in Computation and Intelligent Systems at the University of Deusto.
|
TrainingDataPro/chest-x-rays | ---
language:
- en
license: cc-by-nc-nd-4.0
task_categories:
- image-classification
- image-to-image
tags:
- medical
- code
- biology
dataset_info:
features:
- name: image
dtype: image
- name: type
dtype: string
splits:
- name: train
num_bytes: 325782340.0
num_examples: 97
download_size: 313593688
dataset_size: 325782340.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Chest X-ray
The dataset consists of a collection of chest X-ray images in **.jpg and .dcm** formats. The images are organized into folders based on different medical conditions related to the chest. Each folder contains images depicting specific chest abnormalities.
### Types of diseases and conditions in the dataset:
*Abscess, Ards, Atelectasis, Atherosclerosis of the aorta, Cardiomegaly, Emphysema, Fracture, Hydropneumothorax, Hydrothorax, Pneumonia, Pneumosclerosis, Post inflammatory changes, Post traumatic ribs deformation, Sarcoidosis, Scoliosis, Tuberculosis and Venous congestion*

The dataset is valuable for research in **neurology, radiology, and oncology**. It allows the development and evaluation of computer-based algorithms, machine learning models, and deep learning techniques for **automated detection, diagnosis, and classification** of these conditions.
# Get the dataset
### This is just an example of the data
Leave a request on [**https://trainingdata.pro/data-market**](https://trainingdata.pro/data-market/chest-x-ray-image?utm_source=huggingface&utm_medium=cpc&utm_campaign=chest-x-rays) to discuss your requirements, learn about the price and buy the dataset.
# Content
### The folder "files" includes 17 folders:
- corresponding to the name of a disease/condition and including x-rays of people with that disease/condition (**abscess, ards, atelectasis, etc.**)
- including x-rays in 2 different formats: **.jpg and .dcm**.
### File with the extension .csv includes the following information for each media file:
- **dcm**: link to access the .dcm file,
- **jpg**: link to access the .jpg file,
- **type**: name of the disease or condition on the x-ray
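As a sketch, a row with this schema can be parsed with Python's standard `csv` module; the sample row below (paths and values) is invented for illustration and is not an actual row from the dataset.

```python
import csv
import io

# Parse a row with the schema described above (dcm, jpg, type).
# The sample row is invented for illustration only.
sample = (
    "dcm,jpg,type\n"
    "files/abscess/example.dcm,files/abscess/example.jpg,abscess\n"
)
rows = list(csv.DictReader(io.StringIO(sample)))
print(rows[0]["type"])  # prints: abscess
```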
# Medical data might be collected in accordance with your requirements.
## **[TrainingData](https://trainingdata.pro/data-market/chest-x-ray-image?utm_source=huggingface&utm_medium=cpc&utm_campaign=chest-x-rays)** provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **https://www.kaggle.com/trainingdatapro/datasets**
TrainingData's GitHub: **https://github.com/trainingdata-pro** |
Maeda-miyazaki/dataset_information_extraction | ---
license: cc-by-3.0
---
|
open-llm-leaderboard/details_uukuguy__speechless-codellama-34b-v1.9 | ---
pretty_name: Evaluation run of uukuguy/speechless-codellama-34b-v1.9
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [uukuguy/speechless-codellama-34b-v1.9](https://huggingface.co/uukuguy/speechless-codellama-34b-v1.9)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-codellama-34b-v1.9\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T13:29:15.296218](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-codellama-34b-v1.9/blob/main/results_2023-10-28T13-29-15.296218.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.29771392617449666,\n\
\ \"em_stderr\": 0.004682699129958643,\n \"f1\": 0.3473626258389263,\n\
\ \"f1_stderr\": 0.004601090689469596,\n \"acc\": 0.4917554915020767,\n\
\ \"acc_stderr\": 0.012144352555904984\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.29771392617449666,\n \"em_stderr\": 0.004682699129958643,\n\
\ \"f1\": 0.3473626258389263,\n \"f1_stderr\": 0.004601090689469596\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.24791508718726307,\n \
\ \"acc_stderr\": 0.01189398021482617\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7355958958168903,\n \"acc_stderr\": 0.012394724896983799\n\
\ }\n}\n```"
repo_url: https://huggingface.co/uukuguy/speechless-codellama-34b-v1.9
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|arc:challenge|25_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T13_29_15.296218
path:
- '**/details_harness|drop|3_2023-10-28T13-29-15.296218.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T13-29-15.296218.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T13_29_15.296218
path:
- '**/details_harness|gsm8k|5_2023-10-28T13-29-15.296218.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T13-29-15.296218.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hellaswag|10_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-44-59.061253.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T20-44-59.061253.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-08T20-44-59.061253.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T13_29_15.296218
path:
- '**/details_harness|winogrande|5_2023-10-28T13-29-15.296218.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T13-29-15.296218.parquet'
- config_name: results
data_files:
- split: 2023_10_08T20_44_59.061253
path:
- results_2023-10-08T20-44-59.061253.parquet
- split: 2023_10_28T13_29_15.296218
path:
- results_2023-10-28T13-29-15.296218.parquet
- split: latest
path:
- results_2023-10-28T13-29-15.296218.parquet
---
# Dataset Card for Evaluation run of uukuguy/speechless-codellama-34b-v1.9
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/uukuguy/speechless-codellama-34b-v1.9
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [uukuguy/speechless-codellama-34b-v1.9](https://huggingface.co/uukuguy/speechless-codellama-34b-v1.9) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-codellama-34b-v1.9",
"harness_winogrande_5",
             split="latest")
```
## Latest results
These are the [latest results from run 2023-10-28T13:29:15.296218](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-codellama-34b-v1.9/blob/main/results_2023-10-28T13-29-15.296218.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.29771392617449666,
"em_stderr": 0.004682699129958643,
"f1": 0.3473626258389263,
"f1_stderr": 0.004601090689469596,
"acc": 0.4917554915020767,
"acc_stderr": 0.012144352555904984
},
"harness|drop|3": {
"em": 0.29771392617449666,
"em_stderr": 0.004682699129958643,
"f1": 0.3473626258389263,
"f1_stderr": 0.004601090689469596
},
"harness|gsm8k|5": {
"acc": 0.24791508718726307,
"acc_stderr": 0.01189398021482617
},
"harness|winogrande|5": {
"acc": 0.7355958958168903,
"acc_stderr": 0.012394724896983799
}
}
```
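The "all" accuracy above is simply the mean of the per-task accuracies. A minimal sketch of recomputing it from the per-task entries (the `results` dict below reproduces the two `acc`-reporting tasks from the JSON above; it is hand-copied here for illustration, not loaded from the repository):

```python
# Per-task metrics copied from the latest-results JSON shown above.
results = {
    "harness|gsm8k|5": {"acc": 0.24791508718726307, "acc_stderr": 0.01189398021482617},
    "harness|winogrande|5": {"acc": 0.7355958958168903, "acc_stderr": 0.012394724896983799},
}

# Average accuracy across the tasks that report an `acc` field.
accs = [metrics["acc"] for metrics in results.values() if "acc" in metrics]
mean_acc = sum(accs) / len(accs)
print(round(mean_acc, 4))  # → 0.4918, matching the "all" acc above
```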
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Codec-SUPERB/vocalset_extract_unit | ---
configs:
- config_name: default
data_files:
- split: academicodec_hifi_16k_320d
path: data/academicodec_hifi_16k_320d-*
- split: academicodec_hifi_16k_320d_large_uni
path: data/academicodec_hifi_16k_320d_large_uni-*
- split: academicodec_hifi_24k_320d
path: data/academicodec_hifi_24k_320d-*
- split: audiodec_24k_320d
path: data/audiodec_24k_320d-*
- split: dac_16k
path: data/dac_16k-*
- split: dac_24k
path: data/dac_24k-*
- split: dac_44k
path: data/dac_44k-*
- split: encodec_24k
path: data/encodec_24k-*
- split: funcodec_en_libritts_16k_gr1nq32ds320
path: data/funcodec_en_libritts_16k_gr1nq32ds320-*
- split: funcodec_en_libritts_16k_gr8nq32ds320
path: data/funcodec_en_libritts_16k_gr8nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds320
path: data/funcodec_en_libritts_16k_nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds640
path: data/funcodec_en_libritts_16k_nq32ds640-*
- split: funcodec_zh_en_16k_nq32ds320
path: data/funcodec_zh_en_16k_nq32ds320-*
- split: funcodec_zh_en_16k_nq32ds640
path: data/funcodec_zh_en_16k_nq32ds640-*
- split: speech_tokenizer_16k
path: data/speech_tokenizer_16k-*
dataset_info:
features:
- name: id
dtype: string
- name: unit
sequence:
sequence: int64
splits:
- name: academicodec_hifi_16k_320d
num_bytes: 50687871
num_examples: 3612
- name: academicodec_hifi_16k_320d_large_uni
num_bytes: 50687871
num_examples: 3612
- name: academicodec_hifi_24k_320d
num_bytes: 75978975
num_examples: 3612
- name: audiodec_24k_320d
num_bytes: 162197087
num_examples: 3612
- name: dac_16k
num_bytes: 314926879
num_examples: 3612
- name: dac_24k
num_bytes: 886781599
num_examples: 3612
- name: dac_44k
num_bytes: 263117839
num_examples: 3612
- name: encodec_24k
num_bytes: 38100911
num_examples: 3612
- name: funcodec_en_libritts_16k_gr1nq32ds320
num_bytes: 405680543
num_examples: 3612
- name: funcodec_en_libritts_16k_gr8nq32ds320
num_bytes: 405680543
num_examples: 3612
- name: funcodec_en_libritts_16k_nq32ds320
num_bytes: 405679007
num_examples: 3612
- name: funcodec_en_libritts_16k_nq32ds640
num_bytes: 203356319
num_examples: 3612
- name: funcodec_zh_en_16k_nq32ds320
num_bytes: 405679007
num_examples: 3612
- name: funcodec_zh_en_16k_nq32ds640
num_bytes: 405679007
num_examples: 3612
- name: speech_tokenizer_16k
num_bytes: 101500127
num_examples: 3612
download_size: 652611283
dataset_size: 4175733585
---
# Dataset Card for "vocalset_extract_unit"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
eleldar/rtsd_cleaned | ---
dataset_info:
features:
- name: image
dtype: image
- name: sign_class
dtype: string
- name: sign_id
dtype: int64
splits:
- name: train
num_bytes: -515611439.904
num_examples: 104358
download_size: 58343345
dataset_size: -515611439.904
---
# Cleaned russian traffic sign images dataset
The dataset is generated from the [Russian traffic sign images dataset](https://www.kaggle.com/datasets/watchman/rtsd-dataset) and the [detected signs in the dataset](https://graphics.cs.msu.ru/projects/traffic-sign-recognition.html). |
mzellou/yolo-windmill-fr | ---
license: etalab-2.0
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': images
'1': train
'2': val
splits:
- name: train
num_bytes: 81574182957.536
num_examples: 1796
- name: validation
num_bytes: 11408777591.0
num_examples: 436
download_size: 52014966566
dataset_size: 92982960548.536
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
zwn22/NC_Crime | ---
license: unknown
language:
- en
tags:
- legal
---
# North Carolina(RTP) Police Incident Dataset
## Dataset Description
- **Homepage:** The processed dataset is available at the following GitHub portal: https://raw.githubusercontent.com/zening-wang2023/NC-Crime-Dataset/main/NC_v1.csv.zip. For the raw datasets, their respective homepages are:
- **Cary**:
- [Cary Open Data Portal - CPD Incidents](https://data.townofcary.org/explore/dataset/cpd-incidents/information/?disjunctive.crime_category&disjunctive.crime_type&disjunctive.crimeday&disjunctive.district&disjunctive.offensecategory&disjunctive.violentproperty&disjunctive.total_incidents&disjunctive.year&sort=date_from)
- **Chapel Hill**:
- [Chapel Hill Open Data Portal - Police Incidents](https://opendata-townofchapelhill.hub.arcgis.com/datasets/a761c9be03ef474bbbf4a114778623c5/explore?filters=eyJEYXRlX29mX1JlcG9ydCI6WzEyNjIzMDYxNjAwMDAsMTY5ODcwOTA4MDAwMF19&showTable=true)
- **Durham**:
- [Durham Open Data Portal - DPD Incidents UCR/NIBRS Reporting](https://live-durhamnc.opendata.arcgis.com/documents/DurhamNC::dpd-incidents-ucr-nibrs-reporting/about)
- **Raleigh**:
- [Raleigh Open Data Portal - Police Incidents (NIBRS)](https://data.raleighnc.gov/datasets/ral::raleigh-police-incidents-nibrs/explore?filters=eyJyZXBvcnRlZF95ZWFyIjpbMjAyNCwyMDI0XX0%3D&location=35.779792%2C-78.678454%2C11.17&showTable=true)
- **Point of Contact:** For any issues related to the raw datasets, please reach out to the respective government offices. For inquiries or issues regarding the processed data, you can contact zwn22 at Huggingface.
- **Example Usage:** [Colab](https://colab.research.google.com/drive/1K38qMX2_P_hMoZBeoMleBNZFnFKD8_4X?usp=sharing)
### Dataset Summary
The dataset is compiled from public police incident reports from multiple cities within North Carolina's Research Triangle Park (RTP), covering the years 2015 to 2024. Sourced from the open data portals of Cary, Chapel Hill, Durham, and Raleigh, the data was merged and then cleaned to remove incomplete entries. The dataset focuses on extracting and categorizing major crime types, providing information such as crime type, time and location of occurrence, and other relevant details.
### Supported Tasks
1. **Crime Trend Analysis**: Analyzing crime trends over time and across different locations. This could involve identifying patterns in crime rates, seasonal variations, or shifts in the types of crimes committed.
2. **Predictive Policing**: Developing models to predict future crime occurrences based on historical data. This could help in resource allocation and proactive policing strategies.
3. **Geospatial Analysis**: Mapping crime incidents to identify hotspots and regions with higher crime rates. This can aid in understanding geographical factors influencing crime and in deploying resources more effectively.
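As a small illustration of the geospatial-analysis task above, the sketch below computes the great-circle distance between two incident coordinates using the dataset's `latitude`/`longitude` fields. This is a generic sketch, not part of the dataset's tooling; the sample coordinates are approximate city centers chosen for demonstration.

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

# Distance between two incident locations near Raleigh and Durham (approximate).
print(round(haversine_km(35.7796, -78.6382, 35.9940, -78.8986), 1))
```

Pairwise distances like this are a building block for clustering-based hotspot detection (e.g. DBSCAN with a haversine metric).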
### Languages
English
## Dataset Structure
### Data Instances
Here is an illustrative example from the processed dataset (note that specific details are subject to change):
```json
{
"year": 2022,
"city": "Raleigh",
"crime_major_category": "Theft",
"crime_detail": "Vehicle Theft",
"latitude": 35.7796,
"longitude": -78.6382,
"occurance_time": "2022/05/20 12:00:00"
"clear_status": "Cleared by Arrest",
"incident_address": "123 Main St, Raleigh, NC",
"notes": "Weapon: None",
"crime_severity": "Minor"
}
```
### Data Fields
The dataset contains several fields, each providing specific information about police incidents. Here is a list of these fields along with their descriptions and data types:
- `year` (integer): The year in which the incident occurred. Used as input in temporal analysis tasks.
- `city` (string): The city where the incident took place. This field is crucial for geographic analyses and comparisons between cities.
- `crime_major_category` (string): A broad categorization of the crime, used as input for crime pattern analysis and categorization tasks.
- `crime_specific_category` (string): More detailed classification of the crime, falling under the major category. This field allows for a finer-grained analysis of crime types.
- `latitude` (float) and `longitude` (float): Geographical coordinates pinpointing the location of the incident. These fields are essential for geospatial analysis.
- `occurance_time` (datetime): The beginning time of the incident, providing temporal context. This field is used in analyses that require time-based information.
- `clear_status` (string): The resolution status of the case, such as whether it was cleared by arrest or remains under investigation. This field can be used to understand case outcomes.
- `incident_address` (string): The specific address where the incident occurred. This field adds a detailed spatial dimension to the data.
- `notes` (string): Additional remarks or details about the incident, like weapon usage or other relevant factors. This field provides supplementary information that may be relevant for certain analyses.
- `crime_severity` (string): This column categorizes crime_major_category into three categories ("Minor", "Moderate", "Severe") according to crime severity.
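For illustration, here is a minimal sketch of filtering parsed records by the fields above using only the standard library. The sample rows are invented; only the field names follow the descriptions in this card.

```python
from collections import Counter

# Invented sample rows mirroring the schema documented above.
records = [
    {"year": 2022, "city": "Raleigh", "crime_major_category": "Theft", "crime_severity": "Minor"},
    {"year": 2022, "city": "Durham", "crime_major_category": "Assault", "crime_severity": "Severe"},
    {"year": 2021, "city": "Cary", "crime_major_category": "Theft", "crime_severity": "Minor"},
]

def severity_counts(rows, year):
    """Count incidents per crime_severity for one year."""
    return Counter(r["crime_severity"] for r in rows if r["year"] == year)

print(severity_counts(records, 2022))  # Counter({'Minor': 1, 'Severe': 1})
```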
## Dataset Creation
### Curation Rationale
The dataset, covering police incidents in select North Carolina cities from 2015 to 2024, aims to aid crime research. It provides a long-term view of crime patterns and trends, useful for criminologists, sociologists, and public policy researchers. The comprehensive data enables analyses of crime evolution and its socio-economic correlations. It also supports the development of predictive models for law enforcement and policy planning. Additionally, the dataset's multi-city scope allows for comparative studies to understand unique challenges and inform localized crime prevention strategies.
### Source Data
Four datasets are primarily utilized as source data:
- **Cary**:
- [Cary Open Data Portal - CPD Incidents](https://data.townofcary.org/explore/dataset/cpd-incidents/information/?disjunctive.crime_category&disjunctive.crime_type&disjunctive.crimeday&disjunctive.district&disjunctive.offensecategory&disjunctive.violentproperty&disjunctive.total_incidents&disjunctive.year&sort=date_from)
- Details:
- Size: 116317 rows * 34 columns
- Column names: 'Crime Category', 'Crime Type', 'UCR', 'Map Reference',
'Incident Number', 'Begin Date Of Occurrence',
'Begin Time Of Occurrence', 'End Date Of Occurrence',
'End Time Of Occurrence', 'Crime Day', 'Geo Code', 'Location Category',
'District', 'Beat Number', 'Location', 'ID', 'Lat', 'Lon',
'Charge Count', 'Neighborhood ID', 'Apartment Complex',
'Residential Subdivision', 'Subdivision ID', 'Phx Activity Date',
'Phx Record Status', 'Phx Community', 'Phx Status', 'Record',
'Offense Category', 'Violent Property', 'timeframe', 'domestic',
'Total Incidents', 'Year'
- **Chapel Hill**:
- [Chapel Hill Open Data Portal - Police Incidents](https://opendata-townofchapelhill.hub.arcgis.com/datasets/a761c9be03ef474bbbf4a114778623c5/explore?filters=eyJEYXRlX29mX1JlcG9ydCI6WzEyNjIzMDYxNjAwMDAsMTY5ODcwOTA4MDAwMF19&showTable=true)
- Details:
- Size: 101828 rows * 19 columns
- Column names: 'Incident ID', 'Agency', 'Offense', 'Street', 'City', 'State', 'Zipcode', 'Date of Report', 'Date of Occurrence', 'Date Found', 'Reported As', 'Premise Description', 'Forcible', 'Weapon Description', 'Victim Age', 'Victim Race', 'Victim Gender', 'Latitude', 'Longitude'
- **Durham**:
- [Durham Open Data Portal - DPD Incidents UCR/NIBRS Reporting](https://live-durhamnc.opendata.arcgis.com/documents/DurhamNC::dpd-incidents-ucr-nibrs-reporting/about)
- Details:
- Size: 149924 rows * 16 columns
- Column names: 'Case Number', 'Report Date', 'Report Time', 'Status', 'Sequence',
'ATT/COM', 'UCR Code', 'Offense', 'Address', 'X', 'Y', 'District',
'Beat', 'Tract', 'Premise', 'Weapon'
- **Raleigh**:
- [Raleigh Open Data Portal - Police Incidents (NIBRS)](https://data.raleighnc.gov/datasets/ral::raleigh-police-incidents-nibrs/explore?filters=eyJyZXBvcnRlZF95ZWFyIjpbMjAyNCwyMDI0XX0%3D&location=35.779792%2C-78.678454%2C11.17&showTable=true)
- Details:
- Size: 493912 rows * 19 columns
- Column names: 'Case Number', 'Crime_Category', 'Crime Code', 'Crime Description', 'Crime Type', 'Reported Block Address', 'City of Incident', 'City', 'District', 'Reported Date', 'Reported Year', 'Reported Month', 'Reported Day', 'Reported Hour', 'Reported Day of Week', 'Latitude', 'Longitude', 'Agency', 'Updated_Date'
## Considerations for Using the Data
### Other Known Limitations
The interpretation rights of the dataset are reserved by the respective government authorities. It is subject to change, and the City of Raleigh, as an example, retains the right to modify or discontinue any of the data feeds at any given time. This includes the right to require termination of displaying, distributing, or using the data, for any reason, including but not limited to violations of the Terms of Use. Users should be aware that such changes can occur and that the dataset may evolve over time according to the decisions made by the governing bodies.
## Additional Information
### Dataset Curators
For detailed information regarding the individuals involved in collecting the dataset and their affiliations, as well as any funding details, interested parties are encouraged to directly contact the respective government offices.
### Licensing Information
The dataset from cities in North Carolina, including Cary, Chapel Hill, Durham, and Raleigh, is sourced from their respective open data portals, each with its specific licensing information and terms of use.
Cary's Open Data Portal operates under the Creative Commons Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license. This allows for both commercial and non-commercial use, distribution, and adaptation of the data, as long as proper attribution is given and any derived works are shared under the same terms.
Chapel Hill's Open Data Portal is governed by the Open Database License (ODbL) 1.0 License. This license permits both commercial and non-commercial use, distribution, and modification of the data. Users are required to attribute the data appropriately and, if the data is altered or transformed, or if new datasets are built upon it, the resulting work must be distributed under the same ODbL 1.0 License.
While Durham's Open Data Portals do not specify a particular license, its website states that the data is publicly accessible and can be freely used: "Durham's portal is dedicated to making all government data available for public use without restrictions".
Raleigh's Open Data Policy stipulates that by using data from their site, users agree to the terms and conditions described under the City of Raleigh Open Data Privacy Policy. The City of Raleigh makes no warranties regarding the completeness, accuracy, or timeliness of the data, and disclaims all express or implied warranties, including those of merchantability or fitness for a particular purpose. Users should be aware that the data format or schema may change as updates are made, and they use the information at their own risk.
### Contributions
Thanks to Town of Cary, City of Chapel Hill, City of Durham, and City of Raleigh for providing the raw dataset.
|
nguyenthanhdo/dummy_alpaca | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: model_input
dtype: string
- name: input
dtype: string
- name: model_output
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 46208623
num_examples: 52002
download_size: 24247917
dataset_size: 46208623
---
# Dataset Card for "dummy_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BubbleJoe/bootstrap_sms | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 473170
num_examples: 1325
download_size: 106042
dataset_size: 473170
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "bootstrap_sms"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
loubnabnl/kaggle_scripts_new_format_subset | ---
dataset_info:
features:
- name: file_id
dtype: string
- name: content
dtype: string
- name: local_path
dtype: string
- name: kaggle_dataset_name
dtype: string
- name: kaggle_dataset_owner
dtype: string
- name: kversion
dtype: string
- name: kversion_datasetsources
dtype: string
- name: dataset_versions
dtype: string
- name: datasets
dtype: string
- name: users
dtype: string
- name: script
dtype: string
- name: df_info
dtype: string
- name: has_data_info
dtype: bool
- name: nb_filenames
dtype: int64
- name: retreived_data_description
dtype: string
- name: script_nb_tokens
dtype: int64
- name: upvotes
dtype: int64
- name: tokens_description
dtype: int64
- name: tokens_script
dtype: int64
splits:
- name: train
num_bytes: 26174515828
num_examples: 1160428
download_size: 10883466302
dataset_size: 26174515828
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "kaggle_scripts_new_format_subset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/irelia_leagueoflegends | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Irelia (League of Legends)
This is the dataset of Irelia (League of Legends), containing 30 images and their tags.
The core tags of this character are `long_hair, black_hair, breasts, hair_ornament, large_breasts, blue_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 30 | 42.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/irelia_leagueoflegends/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 30 | 26.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/irelia_leagueoflegends/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 66 | 48.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/irelia_leagueoflegends/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 30 | 38.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/irelia_leagueoflegends/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 66 | 64.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/irelia_leagueoflegends/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/irelia_leagueoflegends',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------|
| 0 | 30 |  |  |  |  |  | 1girl, solo, armor, looking_at_viewer, cleavage |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | armor | looking_at_viewer | cleavage |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------------------|:-----------|
| 0 | 30 |  |  |  |  |  | X | X | X | X | X |
|
shunk031/cocostuff | ---
language:
- en
license: cc-by-4.0
tags:
- computer-vision
- object-detection
- ms-coco
datasets:
- stuff-thing
- stuff-only
metrics:
- accuracy
- iou
---
# Dataset Card for COCO-Stuff
[](https://github.com/shunk031/huggingface-datasets_cocostuff/actions/workflows/ci.yaml)
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Dataset Preprocessing](#dataset-preprocessing)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- Homepage: https://github.com/nightrome/cocostuff
- Repository: https://github.com/nightrome/cocostuff
- Paper (preprint): https://arxiv.org/abs/1612.03716
- Paper (CVPR2018): https://openaccess.thecvf.com/content_cvpr_2018/html/Caesar_COCO-Stuff_Thing_and_CVPR_2018_paper.html
### Dataset Summary
COCO-Stuff is the largest existing dataset with dense stuff and thing annotations.
From the paper:
> Semantic classes can be either things (objects with a well-defined shape, e.g. car, person) or stuff (amorphous background regions, e.g. grass, sky). While lots of classification and detection works focus on thing classes, less attention has been given to stuff classes. Nonetheless, stuff classes are important as they allow to explain important aspects of an image, including (1) scene type; (2) which thing classes are likely to be present and their location (through contextual reasoning); (3) physical attributes, material types and geometric properties of the scene. To understand stuff and things in context we introduce COCO-Stuff, which augments all 164K images of the COCO 2017 dataset with pixel-wise annotations for 91 stuff classes. We introduce an efficient stuff annotation protocol based on superpixels, which leverages the original thing annotations. We quantify the speed versus quality trade-off of our protocol and explore the relation between annotation time and boundary complexity. Furthermore, we use COCO-Stuff to analyze: (a) the importance of stuff and thing classes in terms of their surface cover and how frequently they are mentioned in image captions; (b) the spatial relations between stuff and things, highlighting the rich contextual relations that make our dataset unique; (c) the performance of a modern semantic segmentation method on stuff and thing classes, and whether stuff is easier to segment than things.
### Dataset Preprocessing
### Supported Tasks and Leaderboards
### Languages
All of annotations use English as primary language.
## Dataset Structure
### Data Instances
When loading a specific configuration, users have to specify its name:
```python
from datasets import load_dataset
load_dataset("shunk031/cocostuff", "stuff-thing")
```
#### stuff-thing
An example looks as follows.
```json
{
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=640x480 at 0x7FCA033C9C40>,
'image_filename': '000000000009.jpg',
'image_id': '9',
  'width': 640,
'height': 480,
'objects': [
{
'object_id': '121',
'x': 0,
'y': 11,
'w': 640,
'h': 469,
'name': 'food-other'
},
{
'object_id': '143',
'x': 0,
      'y': 0,
'w': 640,
'h': 480,
'name': 'plastic'
},
{
'object_id': '165',
'x': 0,
'y': 0,
'w': 319,
'h': 118,
'name': 'table'
},
{
'object_id': '183',
'x': 0,
'y': 2,
'w': 631,
'h': 472,
'name': 'unknown-183'
}
],
'stuff_map': <PIL.PngImagePlugin.PngImageFile image mode=L size=640x480 at 0x7FCA0222D880>,
}
```
#### stuff-only
An example looks as follows.
```json
{
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=640x480 at 0x7FCA033C9C40>,
'image_filename': '000000000009.jpg',
'image_id': '9',
  'width': 640,
'height': 480,
'objects': [
{
'object_id': '121',
'x': 0,
'y': 11,
'w': 640,
'h': 469,
'name': 'food-other'
},
{
'object_id': '143',
'x': 0,
      'y': 0,
'w': 640,
'h': 480,
'name': 'plastic'
},
{
'object_id': '165',
'x': 0,
'y': 0,
'w': 319,
'h': 118,
'name': 'table'
},
{
'object_id': '183',
'x': 0,
'y': 2,
'w': 631,
'h': 472,
'name': 'unknown-183'
}
]
}
```
### Data Fields
#### stuff-thing
- `image`: A `PIL.Image.Image` object containing the image.
- `image_id`: Unique numeric ID of the image.
- `image_filename`: File name of the image.
- `width`: Image width.
- `height`: Image height.
- `stuff_map`: A `PIL.Image.Image` object containing the Stuff + thing PNG-style annotations
- `objects`: Holds a list of `Object` data classes:
- `object_id`: Unique numeric ID of the object.
- `x`: x coordinate of bounding box's top left corner.
- `y`: y coordinate of bounding box's top left corner.
- `w`: Bounding box width.
- `h`: Bounding box height.
- `name`: object name
#### stuff-only
- `image`: A `PIL.Image.Image` object containing the image.
- `image_id`: Unique numeric ID of the image.
- `image_filename`: File name of the image.
- `width`: Image width.
- `height`: Image height.
- `objects`: Holds a list of `Object` data classes:
- `object_id`: Unique numeric ID of the object.
- `x`: x coordinate of bounding box's top left corner.
- `y`: y coordinate of bounding box's top left corner.
- `w`: Bounding box width.
- `h`: Bounding box height.
- `name`: object name
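As a usage sketch (not part of the loading script), the bounding boxes above use a top-left corner plus width/height; converting them to corner coordinates is a common first step for drawing or cropping:

```python
def to_corners(obj):
    """Convert an `objects` entry from (x, y, w, h) to (x1, y1, x2, y2)."""
    return (obj["x"], obj["y"], obj["x"] + obj["w"], obj["y"] + obj["h"])

# The "table" object from the data instance shown earlier in this card.
table = {"object_id": "165", "x": 0, "y": 0, "w": 319, "h": 118, "name": "table"}
print(to_corners(table))  # (0, 0, 319, 118)
```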
### Data Splits
| name | train | validation |
|-------------|--------:|-----------:|
| stuff-thing | 118,280 | 5,000 |
| stuff-only | 118,280 | 5,000 |
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
From the paper:
> COCO-Stuff contains 172 classes: 80 thing, 91 stuff, and 1 class unlabeled. The 80 thing classes are the same as in COCO [35]. The 91 stuff classes are curated by an expert annotator. The class unlabeled is used in two situations: if a label does not belong to any of the 171 predefined classes, or if the annotator cannot infer the label of a pixel.
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
COCO-Stuff is a derivative work of the COCO dataset. The authors of COCO do not in any form endorse this work. Different licenses apply:
- COCO images: [Flickr Terms of use](http://cocodataset.org/#termsofuse)
- COCO annotations: [Creative Commons Attribution 4.0 License](http://cocodataset.org/#termsofuse)
- COCO-Stuff annotations & code: [Creative Commons Attribution 4.0 License](http://cocodataset.org/#termsofuse)
### Citation Information
```bibtex
@INPROCEEDINGS{caesar2018cvpr,
title={COCO-Stuff: Thing and stuff classes in context},
author={Caesar, Holger and Uijlings, Jasper and Ferrari, Vittorio},
booktitle={Computer vision and pattern recognition (CVPR), 2018 IEEE conference on},
organization={IEEE},
year={2018}
}
```
### Contributions
Thanks to [@nightrome](https://github.com/nightrome) for publishing the COCO-Stuff dataset.
|
liesvarastranta/arxiv_original_dataset | ---
license: cc-by-4.0
---
This is the same dataset as the one on Kaggle (https://www.kaggle.com/datasets/Cornell-University/arxiv).
The format has been changed from .json to .csv. |
MedRAG/wikipedia | ---
task_categories:
- question-answering
language:
- en
tags:
- medical
- question answering
- large language model
- retrieval-augmented generation
size_categories:
- 10M<n<100M
---
# The Wikipedia Corpus in MedRAG
This HF dataset contains the chunked snippets from the Wikipedia corpus used in [MedRAG](https://arxiv.org/abs/2402.13178). It can be used for medical Retrieval-Augmented Generation (RAG).
## News
- (02/26/2024) The "id" column has been reformatted. A new "wiki_id" column is added.
## Dataset Details
### Dataset Descriptions
As a large-scale open-source encyclopedia, Wikipedia is frequently used as a corpus in information retrieval tasks.
We select Wikipedia as one of the corpora to see whether a general-domain database can improve performance on medical QA.
We downloaded the processed Wikipedia data from [HuggingFace](https://huggingface.co/datasets/wikipedia) and chunked the text using [LangChain](https://www.langchain.com/) as snippets with no more than 1000 characters.
This HF dataset contains our ready-to-use chunked snippets for the Wikipedia corpus, including 29,913,202 snippets with an average of 162 tokens.
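The 1000-character limit can be reproduced with a simple whitespace-based splitter like the sketch below. This is only an approximation for illustration — the actual corpus was chunked with LangChain's splitter, whose exact boundary rules differ.

```python
def chunk_text(text, max_chars=1000):
    """Greedily pack whitespace-separated words into chunks of <= max_chars."""
    chunks, current = [], ""
    for word in text.split():
        candidate = f"{current} {word}".strip()
        if len(candidate) > max_chars and current:
            chunks.append(current)
            current = word
        else:
            current = candidate
    if current:
        chunks.append(current)
    return chunks

snippets = chunk_text("lorem ipsum dolor sit amet " * 400)
print(max(len(s) for s in snippets))  # every chunk stays within the limit
```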
### Dataset Structure
Each row is a snippet of Wikipedia, which includes the following features:
- id: a unique identifier of the snippet
- title: the title of the Wikipedia article from which the snippet is collected
- content: the content of the snippet
- contents: a concatenation of 'title' and 'content', which will be used by the [BM25](https://github.com/castorini/pyserini) retriever
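For example, a row's `contents` field can be reconstructed from its parts. The exact separator used in the released corpus is an assumption here (a single space); check a downloaded row before relying on it.

```python
def make_contents(title, content):
    """Concatenate title and content (the separator is an assumption, not verified)."""
    return f"{title} {content}"

# Invented example row following the feature list above.
row = {
    "id": "wiki_example",
    "title": "Facial nerve",
    "content": "The facial nerve exits the skull at the stylomastoid foramen.",
}
row["contents"] = make_contents(row["title"], row["content"])
print(row["contents"])
```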
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
```shell
git clone https://huggingface.co/datasets/MedRAG/wikipedia
```
### Use in MedRAG
```python
>> from src.medrag import MedRAG
>> question = "A lesion causing compression of the facial nerve at the stylomastoid foramen will cause ipsilateral"
>> options = {
"A": "paralysis of the facial muscles.",
"B": "paralysis of the facial muscles and loss of taste.",
"C": "paralysis of the facial muscles, loss of taste and lacrimation.",
"D": "paralysis of the facial muscles, loss of taste, lacrimation and decreased salivation."
}
>> medrag = MedRAG(llm_name="OpenAI/gpt-3.5-turbo-16k", rag=True, retriever_name="MedCPT", corpus_name="Wikipedia")
>> answer, snippets, scores = medrag.answer(question=question, options=options, k=32) # scores are given by the retrieval system
```
## Citation
```bibtex
@article{xiong2024benchmarking,
title={Benchmarking Retrieval-Augmented Generation for Medicine},
author={Guangzhi Xiong and Qiao Jin and Zhiyong Lu and Aidong Zhang},
journal={arXiv preprint arXiv:2402.13178},
year={2024}
}
``` |
mgreg555/Little_Prince | ---
license: unknown
---
|
dvilasuero/bankingapp_sentiment | ---
dataset_info:
features:
- name: text
dtype: string
- name: inputs
struct:
- name: text
dtype: string
- name: prediction
dtype: 'null'
- name: prediction_agent
dtype: 'null'
- name: annotation
dtype: string
- name: annotation_agent
dtype: string
- name: multi_label
dtype: bool
- name: explanation
dtype: 'null'
- name: id
dtype: string
- name: metadata
dtype: 'null'
- name: status
dtype: string
- name: event_timestamp
dtype: 'null'
- name: metrics
struct:
- name: text_length
dtype: int64
splits:
- name: train
num_bytes: 163514
num_examples: 1000
download_size: 79893
dataset_size: 163514
---
# Dataset Card for "bankingapp_sentiment"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yangjinlong/gz | ---
license: mit
---
|
khoomeik/gzipscale-0.40-30_200_15_20-100M | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 337227221
num_examples: 390625
download_size: 91592951
dataset_size: 337227221
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ruanchaves/boun | ---
annotations_creators:
- expert-generated
language_creators:
- machine-generated
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- unknown
source_datasets:
- original
task_categories:
- structure-prediction
task_ids: []
pretty_name: BOUN
tags:
- word-segmentation
---
# Dataset Card for BOUN
## Dataset Description
- **Repository:** [ardax/hashtag-segmentor](https://github.com/ardax/hashtag-segmentor)
- **Paper:** [Segmenting Hashtags and Analyzing Their Grammatical Structure](https://asistdl.onlinelibrary.wiley.com/doi/epdf/10.1002/asi.23989?author_access_token=qbKcE1jrre5nbv_Tn9csbU4keas67K9QMdWULTWMo8NOtY2aA39ck2w5Sm4ePQ1MZhbjCdEuaRlPEw2Kd12jzvwhwoWP0fdroZAwWsmXHPXxryDk_oBCup1i9_VDNIpU)
### Dataset Summary
Dev-BOUN is a development set of 500 manually segmented hashtags, selected from tweets about movies, TV shows, popular people, sports teams, etc.
Test-BOUN is a test set of 500 manually segmented hashtags, drawn from the same kinds of tweets.
### Languages
English
## Dataset Structure
### Data Instances
```
{
"index": 0,
"hashtag": "tryingtosleep",
"segmentation": "trying to sleep"
}
```
### Data Fields
- `index`: a numerical index.
- `hashtag`: the original hashtag.
- `segmentation`: the gold segmentation for the hashtag.
## Dataset Creation
- All hashtag segmentation and identifier splitting datasets on this profile have the same basic fields: `hashtag` and `segmentation` or `identifier` and `segmentation`.
- The only difference between `hashtag` and `segmentation` (or between `identifier` and `segmentation`) is the whitespace characters. Spell checking, expanding abbreviations, and correcting characters to uppercase go into other fields.
- There is always whitespace between an alphanumeric character and a sequence of any special characters ( such as `_` , `:`, `~` ).
- If there are any annotations for named entity recognition and other token classification tasks, they are given in a `spans` field.
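The whitespace-only invariant above can be checked mechanically. A minimal sketch, using the example instance shown earlier:

```python
# Check the invariant described above: removing all whitespace from
# `segmentation` must recover the original `hashtag` exactly.

def is_valid_segmentation(hashtag: str, segmentation: str) -> bool:
    return "".join(segmentation.split()) == hashtag

example = {"index": 0, "hashtag": "tryingtosleep", "segmentation": "trying to sleep"}
ok = is_valid_segmentation(example["hashtag"], example["segmentation"])
```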
## Additional Information
### Citation Information
```
@article{celebi2018segmenting,
title={Segmenting hashtags and analyzing their grammatical structure},
author={Celebi, Arda and {\"O}zg{\"u}r, Arzucan},
journal={Journal of the Association for Information Science and Technology},
volume={69},
number={5},
pages={675--686},
year={2018},
publisher={Wiley Online Library}
}
```
### Contributions
This dataset was added by [@ruanchaves](https://github.com/ruanchaves) while developing the [hashformers](https://github.com/ruanchaves/hashformers) library. |
CyberHarem/alice_nikke | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of alice/アリス/爱丽丝/앨리스 (Nikke: Goddess of Victory)
This is the dataset of alice/アリス/爱丽丝/앨리스 (Nikke: Goddess of Victory), containing 416 images and their tags.
The core tags of this character are `long_hair, twintails, breasts, animal_ears, pink_eyes, headphones, fake_animal_ears, bangs, animal_ear_headphones, white_hair, medium_breasts, large_breasts, pink_hair, very_long_hair, sidelocks`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 416 | 810.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/alice_nikke/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 416 | 388.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/alice_nikke/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1140 | 926.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/alice_nikke/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 416 | 685.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/alice_nikke/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1140 | 1.38 GiB | [Download](https://huggingface.co/datasets/CyberHarem/alice_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/alice_nikke',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, ass, from_behind, jacket, looking_at_viewer, looking_back, pink_bodysuit, skin_tight, solo, thighs, headset, blush, closed_mouth, long_sleeves, white_gloves, shiny_clothes, latex, ponytail, smile |
| 1 | 27 |  |  |  |  |  | 1girl, looking_at_viewer, pink_bodysuit, skin_tight, solo, headset, blush, long_sleeves, latex_bodysuit, open_mouth, pink_gloves, covered_navel, impossible_bodysuit, simple_background, :d, shrug_(clothing), white_background, cropped_jacket, red_jacket, cowboy_shot |
| 2 | 9 |  |  |  |  |  | 1girl, cropped_jacket, headset, pink_bodysuit, skin_tight, solo, latex_bodysuit, red_jacket, shrug_(clothing), impossible_bodysuit, looking_at_viewer, sneakers, long_sleeves, pink_gloves, shiny_clothes, socks, white_footwear, blush, grey_hair, holding_gun, rifle, simple_background, squatting, white_background, full_body |
| 3 | 8 |  |  |  |  |  | 1girl, cropped_jacket, headset, latex_bodysuit, looking_at_viewer, open_mouth, pink_bodysuit, red_jacket, shiny_clothes, skin_tight, solo, blush, covered_nipples, impossible_bodysuit, long_sleeves, sneakers, snowing, white_footwear, covered_navel, grey_hair, outdoors, :d, cameltoe, animal, full_body, mountain, pink_gloves, snowflakes, thighs, white_gloves, ass, multicolored_clothes, purple_eyes, rabbit, salute, white_socks |
| 4 | 8 |  |  |  |  |  | 1boy, 1girl, hetero, open_mouth, solo_focus, blush, headset, holding_hands, interlocked_fingers, mosaic_censoring, penis, long_sleeves, nipples, vaginal, clothed_sex, covered_navel, pink_bodysuit, skin_tight, smile, gloves, red_jacket, shiny, shrug_(clothing), stomach |
| 5 | 9 |  |  |  |  |  | playboy_bunny, rabbit_ears, cleavage, looking_at_viewer, 1girl, smile, strapless_leotard, detached_collar, open_mouth, pantyhose, rabbit_tail, solo, bowtie, pink_leotard, wrist_cuffs, bare_shoulders, blush, cowboy_shot, fake_tail, teeth |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | ass | from_behind | jacket | looking_at_viewer | looking_back | pink_bodysuit | skin_tight | solo | thighs | headset | blush | closed_mouth | long_sleeves | white_gloves | shiny_clothes | latex | ponytail | smile | latex_bodysuit | open_mouth | pink_gloves | covered_navel | impossible_bodysuit | simple_background | :d | shrug_(clothing) | white_background | cropped_jacket | red_jacket | cowboy_shot | sneakers | socks | white_footwear | grey_hair | holding_gun | rifle | squatting | full_body | covered_nipples | snowing | outdoors | cameltoe | animal | mountain | snowflakes | multicolored_clothes | purple_eyes | rabbit | salute | white_socks | 1boy | hetero | solo_focus | holding_hands | interlocked_fingers | mosaic_censoring | penis | nipples | vaginal | clothed_sex | gloves | shiny | stomach | playboy_bunny | rabbit_ears | cleavage | strapless_leotard | detached_collar | pantyhose | rabbit_tail | bowtie | pink_leotard | wrist_cuffs | bare_shoulders | fake_tail | teeth |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------|:--------------|:---------|:--------------------|:---------------|:----------------|:-------------|:-------|:---------|:----------|:--------|:---------------|:---------------|:---------------|:----------------|:--------|:-----------|:--------|:-----------------|:-------------|:--------------|:----------------|:----------------------|:--------------------|:-----|:-------------------|:-------------------|:-----------------|:-------------|:--------------|:-----------|:--------|:-----------------|:------------|:--------------|:--------|:------------|:------------|:------------------|:----------|:-----------|:-----------|:---------|:-----------|:-------------|:-----------------------|:--------------|:---------|:---------|:--------------|:-------|:---------|:-------------|:----------------|:----------------------|:-------------------|:--------|:----------|:----------|:--------------|:---------|:--------|:----------|:----------------|:--------------|:-----------|:--------------------|:------------------|:------------|:--------------|:---------|:---------------|:--------------|:-----------------|:------------|:--------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 27 |  |  |  |  |  | X | | | | X | | X | X | X | | X | X | | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | | | | X | | X | X | X | | X | X | | X | | X | | | | X | | X | | X | X | | X | X | X | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | X | | | X | | X | X | X | X | X | X | | X | X | X | | | | X | X | X | X | X | | X | | | X | X | | X | | X | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 8 |  |  |  |  |  | X | | | | | | X | X | | | X | X | | X | | | | | X | | X | | X | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 5 | 9 |  |  |  |  |  | X | | | | X | | | | X | | | X | | | | | | | X | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
kanemitsukun/facade_of_kyoto | ---
license: openrail
---
|
liuyanchen1015/MULTI_VALUE_sst2_serial_verb_give | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: test
num_bytes: 225
num_examples: 1
- name: train
num_bytes: 4532
num_examples: 31
download_size: 6577
dataset_size: 4757
---
# Dataset Card for "MULTI_VALUE_sst2_serial_verb_give"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DeepLearner101/ImageNetCIFAR100MappedSubset | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: int64
splits:
- name: train
num_bytes: 65872065.0
num_examples: 1760
- name: validation
num_bytes: 20151623.0
num_examples: 550
download_size: 177372502
dataset_size: 86023688.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
ChrisWilson/twitter_dataset_1712970766 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 18977
num_examples: 42
download_size: 13992
dataset_size: 18977
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_Weyaxi__Platypus-Nebula-v2-7B | ---
pretty_name: Evaluation run of Weyaxi/Platypus-Nebula-v2-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Weyaxi/Platypus-Nebula-v2-7B](https://huggingface.co/Weyaxi/Platypus-Nebula-v2-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__Platypus-Nebula-v2-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-04T11:25:54.972492](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Platypus-Nebula-v2-7B/blob/main/results_2023-12-04T11-25-54.972492.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5564269677143451,\n\
\ \"acc_stderr\": 0.03374811024697019,\n \"acc_norm\": 0.5651516514420288,\n\
\ \"acc_norm_stderr\": 0.03451915951620442,\n \"mc1\": 0.3268053855569155,\n\
\ \"mc1_stderr\": 0.01641987473113502,\n \"mc2\": 0.4693887506938676,\n\
\ \"mc2_stderr\": 0.015134250861855079\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5213310580204779,\n \"acc_stderr\": 0.014598087973127108,\n\
\ \"acc_norm\": 0.5537542662116041,\n \"acc_norm_stderr\": 0.014526705548539982\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6377215694084843,\n\
\ \"acc_stderr\": 0.004796763521045228,\n \"acc_norm\": 0.8302131049591714,\n\
\ \"acc_norm_stderr\": 0.003746781712509652\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5333333333333333,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.5333333333333333,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5723684210526315,\n \"acc_stderr\": 0.04026097083296564,\n\
\ \"acc_norm\": 0.5723684210526315,\n \"acc_norm_stderr\": 0.04026097083296564\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5584905660377358,\n \"acc_stderr\": 0.030561590426731833,\n\
\ \"acc_norm\": 0.5584905660377358,\n \"acc_norm_stderr\": 0.030561590426731833\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6319444444444444,\n\
\ \"acc_stderr\": 0.04032999053960718,\n \"acc_norm\": 0.6319444444444444,\n\
\ \"acc_norm_stderr\": 0.04032999053960718\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n\
\ \"acc_stderr\": 0.03778621079092055,\n \"acc_norm\": 0.5664739884393064,\n\
\ \"acc_norm_stderr\": 0.03778621079092055\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105654,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.49361702127659574,\n \"acc_stderr\": 0.03268335899936337,\n\
\ \"acc_norm\": 0.49361702127659574,\n \"acc_norm_stderr\": 0.03268335899936337\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.38095238095238093,\n \"acc_stderr\": 0.025010749116137595,\n \"\
acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.025010749116137595\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6612903225806451,\n \"acc_stderr\": 0.026923446059302844,\n \"\
acc_norm\": 0.6612903225806451,\n \"acc_norm_stderr\": 0.026923446059302844\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.03481904844438804,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.03481904844438804\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.035679697722680495,\n\
\ \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.035679697722680495\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6868686868686869,\n \"acc_stderr\": 0.033042050878136525,\n \"\
acc_norm\": 0.6868686868686869,\n \"acc_norm_stderr\": 0.033042050878136525\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7202072538860104,\n \"acc_stderr\": 0.03239637046735704,\n\
\ \"acc_norm\": 0.7202072538860104,\n \"acc_norm_stderr\": 0.03239637046735704\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5282051282051282,\n \"acc_stderr\": 0.025310639254933882,\n\
\ \"acc_norm\": 0.5282051282051282,\n \"acc_norm_stderr\": 0.025310639254933882\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145654,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145654\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5378151260504201,\n \"acc_stderr\": 0.0323854694875898,\n \
\ \"acc_norm\": 0.5378151260504201,\n \"acc_norm_stderr\": 0.0323854694875898\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.0386155754625517,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.0386155754625517\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7963302752293578,\n \"acc_stderr\": 0.01726674208763079,\n \"\
acc_norm\": 0.7963302752293578,\n \"acc_norm_stderr\": 0.01726674208763079\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.375,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.375,\n\
\ \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7009803921568627,\n \"acc_stderr\": 0.03213325717373617,\n\
\ \"acc_norm\": 0.7009803921568627,\n \"acc_norm_stderr\": 0.03213325717373617\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677697,\n\
\ \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677697\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.039849796533028725,\n \"\
acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.039849796533028725\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6574074074074074,\n\
\ \"acc_stderr\": 0.045879047413018105,\n \"acc_norm\": 0.6574074074074074,\n\
\ \"acc_norm_stderr\": 0.045879047413018105\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.036429145782924055,\n\
\ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.036429145782924055\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n\
\ \"acc_stderr\": 0.025819233256483717,\n \"acc_norm\": 0.8076923076923077,\n\
\ \"acc_norm_stderr\": 0.025819233256483717\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7790549169859514,\n\
\ \"acc_stderr\": 0.01483620516733356,\n \"acc_norm\": 0.7790549169859514,\n\
\ \"acc_norm_stderr\": 0.01483620516733356\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.02648339204209818,\n\
\ \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.02648339204209818\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3206703910614525,\n\
\ \"acc_stderr\": 0.015609929559348402,\n \"acc_norm\": 0.3206703910614525,\n\
\ \"acc_norm_stderr\": 0.015609929559348402\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6045751633986928,\n \"acc_stderr\": 0.027996723180631438,\n\
\ \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.027996723180631438\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.026082700695399662,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.026082700695399662\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6790123456790124,\n \"acc_stderr\": 0.02597656601086274,\n\
\ \"acc_norm\": 0.6790123456790124,\n \"acc_norm_stderr\": 0.02597656601086274\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.42907801418439717,\n \"acc_stderr\": 0.02952591430255856,\n \
\ \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.02952591430255856\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4452411994784876,\n\
\ \"acc_stderr\": 0.012693421303973294,\n \"acc_norm\": 0.4452411994784876,\n\
\ \"acc_norm_stderr\": 0.012693421303973294\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5919117647058824,\n \"acc_stderr\": 0.029855261393483924,\n\
\ \"acc_norm\": 0.5919117647058824,\n \"acc_norm_stderr\": 0.029855261393483924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5522875816993464,\n \"acc_stderr\": 0.020116925347422425,\n \
\ \"acc_norm\": 0.5522875816993464,\n \"acc_norm_stderr\": 0.020116925347422425\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5020408163265306,\n \"acc_stderr\": 0.0320089533497105,\n\
\ \"acc_norm\": 0.5020408163265306,\n \"acc_norm_stderr\": 0.0320089533497105\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n\
\ \"acc_stderr\": 0.03152439186555404,\n \"acc_norm\": 0.7263681592039801,\n\
\ \"acc_norm_stderr\": 0.03152439186555404\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.040201512610368445,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.040201512610368445\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.032744852119469564,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.032744852119469564\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3268053855569155,\n\
\ \"mc1_stderr\": 0.01641987473113502,\n \"mc2\": 0.4693887506938676,\n\
\ \"mc2_stderr\": 0.015134250861855079\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7221783741120757,\n \"acc_stderr\": 0.01258891818387159\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10083396512509477,\n \
\ \"acc_stderr\": 0.008294031192126605\n }\n}\n```"
repo_url: https://huggingface.co/Weyaxi/Platypus-Nebula-v2-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|arc:challenge|25_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|gsm8k|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hellaswag|10_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T11-25-54.972492.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T11-25-54.972492.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- '**/details_harness|winogrande|5_2023-12-04T11-25-54.972492.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-04T11-25-54.972492.parquet'
- config_name: results
data_files:
- split: 2023_12_04T11_25_54.972492
path:
- results_2023-12-04T11-25-54.972492.parquet
- split: latest
path:
- results_2023-12-04T11-25-54.972492.parquet
---
# Dataset Card for Evaluation run of Danielbrdz/Barcenas-3b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Danielbrdz/Barcenas-3b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Danielbrdz/Barcenas-3b](https://huggingface.co/Danielbrdz/Barcenas-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Danielbrdz__Barcenas-3b",
"harness_winogrande_5",
split="train")
```
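As noted above, each timestamped split name is derived from the run timestamp: dashes in the date and colons in the time become underscores, while the fractional-seconds dot is kept. A minimal sketch of that mapping (the helper name is hypothetical, inferred from the split names in this card's YAML header):

```python
# Convert a run timestamp into the split name used in this dataset's configs.
# Hypothetical helper, based on the split names listed in the YAML header
# (e.g. "2023_12_04T11_25_54.972492").
def timestamp_to_split_name(ts: str) -> str:
    date_part, time_part = ts.split("T")
    date_part = date_part.replace("-", "_")   # dashes in the date -> underscores
    time_part = time_part.replace(":", "_")   # colons in the time -> underscores
    return f"{date_part}T{time_part}"         # the sub-second dot is preserved

print(timestamp_to_split_name("2023-12-04T11:25:54.972492"))
# -> 2023_12_04T11_25_54.972492
```

Passing such a name as `split=` to `load_dataset` (instead of `"latest"`) should select that specific run's results.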
## Latest results
These are the [latest results from run 2023-12-04T11:25:54.972492](https://huggingface.co/datasets/open-llm-leaderboard/details_Danielbrdz__Barcenas-3b/blob/main/results_2023-12-04T11-25-54.972492.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5564269677143451,
"acc_stderr": 0.03374811024697019,
"acc_norm": 0.5651516514420288,
"acc_norm_stderr": 0.03451915951620442,
"mc1": 0.3268053855569155,
"mc1_stderr": 0.01641987473113502,
"mc2": 0.4693887506938676,
"mc2_stderr": 0.015134250861855079
},
"harness|arc:challenge|25": {
"acc": 0.5213310580204779,
"acc_stderr": 0.014598087973127108,
"acc_norm": 0.5537542662116041,
"acc_norm_stderr": 0.014526705548539982
},
"harness|hellaswag|10": {
"acc": 0.6377215694084843,
"acc_stderr": 0.004796763521045228,
"acc_norm": 0.8302131049591714,
"acc_norm_stderr": 0.003746781712509652
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5723684210526315,
"acc_stderr": 0.04026097083296564,
"acc_norm": 0.5723684210526315,
"acc_norm_stderr": 0.04026097083296564
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5584905660377358,
"acc_stderr": 0.030561590426731833,
"acc_norm": 0.5584905660377358,
"acc_norm_stderr": 0.030561590426731833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6319444444444444,
"acc_stderr": 0.04032999053960718,
"acc_norm": 0.6319444444444444,
"acc_norm_stderr": 0.04032999053960718
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.03778621079092055,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.03778621079092055
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105654,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.49361702127659574,
"acc_stderr": 0.03268335899936337,
"acc_norm": 0.49361702127659574,
"acc_norm_stderr": 0.03268335899936337
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4413793103448276,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.4413793103448276,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.025010749116137595,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.025010749116137595
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6612903225806451,
"acc_stderr": 0.026923446059302844,
"acc_norm": 0.6612903225806451,
"acc_norm_stderr": 0.026923446059302844
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.03481904844438804,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.03481904844438804
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.703030303030303,
"acc_stderr": 0.035679697722680495,
"acc_norm": 0.703030303030303,
"acc_norm_stderr": 0.035679697722680495
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6868686868686869,
"acc_stderr": 0.033042050878136525,
"acc_norm": 0.6868686868686869,
"acc_norm_stderr": 0.033042050878136525
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7202072538860104,
"acc_stderr": 0.03239637046735704,
"acc_norm": 0.7202072538860104,
"acc_norm_stderr": 0.03239637046735704
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5282051282051282,
"acc_stderr": 0.025310639254933882,
"acc_norm": 0.5282051282051282,
"acc_norm_stderr": 0.025310639254933882
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145654,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145654
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5378151260504201,
"acc_stderr": 0.0323854694875898,
"acc_norm": 0.5378151260504201,
"acc_norm_stderr": 0.0323854694875898
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.0386155754625517,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.0386155754625517
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7963302752293578,
"acc_stderr": 0.01726674208763079,
"acc_norm": 0.7963302752293578,
"acc_norm_stderr": 0.01726674208763079
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.375,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.375,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7009803921568627,
"acc_stderr": 0.03213325717373617,
"acc_norm": 0.7009803921568627,
"acc_norm_stderr": 0.03213325717373617
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7383966244725738,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.7383966244725738,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6870229007633588,
"acc_stderr": 0.04066962905677697,
"acc_norm": 0.6870229007633588,
"acc_norm_stderr": 0.04066962905677697
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.039849796533028725,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.039849796533028725
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.045879047413018105,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.045879047413018105
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.036429145782924055,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.036429145782924055
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.025819233256483717,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.025819233256483717
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7790549169859514,
"acc_stderr": 0.01483620516733356,
"acc_norm": 0.7790549169859514,
"acc_norm_stderr": 0.01483620516733356
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.02648339204209818,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.02648339204209818
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3206703910614525,
"acc_stderr": 0.015609929559348402,
"acc_norm": 0.3206703910614525,
"acc_norm_stderr": 0.015609929559348402
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.027996723180631438,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.027996723180631438
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.026082700695399662,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.026082700695399662
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6790123456790124,
"acc_stderr": 0.02597656601086274,
"acc_norm": 0.6790123456790124,
"acc_norm_stderr": 0.02597656601086274
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.02952591430255856,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.02952591430255856
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4452411994784876,
"acc_stderr": 0.012693421303973294,
"acc_norm": 0.4452411994784876,
"acc_norm_stderr": 0.012693421303973294
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5919117647058824,
"acc_stderr": 0.029855261393483924,
"acc_norm": 0.5919117647058824,
"acc_norm_stderr": 0.029855261393483924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5522875816993464,
"acc_stderr": 0.020116925347422425,
"acc_norm": 0.5522875816993464,
"acc_norm_stderr": 0.020116925347422425
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5020408163265306,
"acc_stderr": 0.0320089533497105,
"acc_norm": 0.5020408163265306,
"acc_norm_stderr": 0.0320089533497105
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.03152439186555404,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.03152439186555404
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.8,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.032744852119469564,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.032744852119469564
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3268053855569155,
"mc1_stderr": 0.01641987473113502,
"mc2": 0.4693887506938676,
"mc2_stderr": 0.015134250861855079
},
"harness|winogrande|5": {
"acc": 0.7221783741120757,
"acc_stderr": 0.01258891818387159
},
"harness|gsm8k|5": {
"acc": 0.10083396512509477,
"acc_stderr": 0.008294031192126605
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
TrainingDataPro/anti-spoofing-real-waist-high-dataset | ---
license: cc-by-nc-nd-4.0
task_categories:
- video-classification
- image-to-image
language:
- en
tags:
- legal
dataset_info:
features:
- name: photo
dtype: image
- name: video
dtype: string
- name: phone
dtype: string
- name: gender
dtype: string
- name: age
dtype: int8
- name: country
dtype: string
splits:
- name: train
num_bytes: 34728975
num_examples: 8
download_size: 195022198
dataset_size: 34728975
---
# Anti-Spoofing Real Waist-High Dataset
The dataset consists of waist-high selfies and videos of real people. It supports anti-spoofing tasks and is useful for business and security systems.
### The dataset includes 2 different types of files:
- **Photo** - a selfie taken on a mobile phone; the person is depicted alone, waist-high, with the face clearly visible.
- **Video** - filmed on the front camera, in which the person moves their head left, right, up and down. The video lasts 10 to 20 seconds.

# Get the dataset
### This is just an example of the data
Leave a request on [**https://trainingdata.pro/data-market**](https://trainingdata.pro/data-market/anti-spoofing-real?utm_source=huggingface&utm_medium=cpc&utm_campaign=anti-spoofing-real-waist-high-dataset) to discuss your requirements, learn about the price and buy the dataset.
# Content
- The folder **"photo"** includes selfies of people
- The folder **"video"** includes videos of people
### File with the extension .csv
includes the following information for each media file:
- **photo**: link to access the selfie,
- **video**: link to access the video,
- **phone**: the device used to capture selfie and video,
- **gender**: gender of a person,
- **age**: age of the person,
- **country**: country of the person
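As an illustration, the metadata file could be parsed as follows. This is a minimal sketch: the sample rows and file paths are hypothetical, while the column names follow the list above.

```python
import csv
import io

# Hypothetical sample rows mirroring the columns of the metadata .csv
# (photo, video, phone, gender, age, country); real access links are
# provided with the purchased dataset.
SAMPLE_CSV = """photo,video,phone,gender,age,country
photo/0001.jpg,video/0001.mp4,iPhone 13,female,27,Spain
photo/0002.jpg,video/0002.mp4,Samsung Galaxy S21,male,34,Brazil
"""

def load_metadata(csv_text: str) -> list[dict]:
    """Parse the metadata CSV into a list of per-person records."""
    reader = csv.DictReader(io.StringIO(csv_text))
    records = []
    for row in reader:
        row["age"] = int(row["age"])  # age is stored as an integer
        records.append(row)
    return records

records = load_metadata(SAMPLE_CSV)
print(len(records))            # 2
print(records[0]["country"])   # Spain
```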
## [**TrainingData**](https://trainingdata.pro/data-market/anti-spoofing-real?utm_source=huggingface&utm_medium=cpc&utm_campaign=anti-spoofing-real-waist-high-dataset) provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **https://www.kaggle.com/trainingdatapro/datasets**
TrainingData's GitHub: **https://github.com/Trainingdata-datamarket/TrainingData_All_datasets** |
tyzhu/lmind_nq_train6000_eval6489_v1_reciteonly_qa_v1 | ---
dataset_info:
features:
- name: answers
struct:
- name: answer_start
sequence: 'null'
- name: text
sequence: string
- name: inputs
dtype: string
- name: targets
dtype: string
- name: true_doc
dtype: string
splits:
- name: train_qa
num_bytes: 1057233
num_examples: 6000
- name: train_ic_qa
num_bytes: 4900402
num_examples: 6000
- name: train_recite_qa
num_bytes: 8155705
num_examples: 6000
- name: eval_qa
num_bytes: 1142332
num_examples: 6489
- name: eval_ic_qa
num_bytes: 5295716
num_examples: 6489
- name: eval_recite_qa
num_bytes: 8812988
num_examples: 6489
- name: all_docs
num_bytes: 7497763
num_examples: 10925
- name: all_docs_eval
num_bytes: 14017729
num_examples: 10925
- name: train
num_bytes: 8155705
num_examples: 6000
- name: validation
num_bytes: 8812988
num_examples: 6489
download_size: 42116704
dataset_size: 67848561
configs:
- config_name: default
data_files:
- split: train_qa
path: data/train_qa-*
- split: train_ic_qa
path: data/train_ic_qa-*
- split: train_recite_qa
path: data/train_recite_qa-*
- split: eval_qa
path: data/eval_qa-*
- split: eval_ic_qa
path: data/eval_ic_qa-*
- split: eval_recite_qa
path: data/eval_recite_qa-*
- split: all_docs
path: data/all_docs-*
- split: all_docs_eval
path: data/all_docs_eval-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/3eeea607 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 182
num_examples: 10
download_size: 1338
dataset_size: 182
---
# Dataset Card for "3eeea607"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alkahestry/reward-rpio | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 2508025
num_examples: 3146
download_size: 1509167
dataset_size: 2508025
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "reward-rpio"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
abercowsky/autotrain-data-sexual-content-classification | ---
language:
- en
task_categories:
- text-classification
---
# AutoTrain Dataset for project: sexual-content-classification
## Dataset Description
This dataset has been automatically processed by AutoTrain for project sexual-content-classification.
### Languages
The BCP-47 code for the dataset's language is en.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": "You're No Good: was covered in a College babes fucks with the neighbor charting version by which soul singer and pianist?",
"target": 1
},
{
"text": "AT first \u201cthe button\u201d seemed like an April Fools\u2019 joke.\n\nNow, 13 days later, Reddit\u2019s social experiment is still holding momentum.\n\nThe button feature was added to Reddit on April 1 and contains a timer which counts down from 60 seconds to zero.\n\nHowever, every time the button is pushed, the timer is reset.\n\nAlthough Reddit users have been speculating the reason for the experiment, no one knows its specific purpose.\n\nAdditionally, no one is aware what will happen when the countdown reaches zero because the timer is yet to fall below 29 seconds.\n\nRedditors can only use the feature if they were a member of the website before April 1 and they can only push the button once.\n\nAs of this afternoon, over 711,000 members have pushed the button.\n\nSince its inception, members have received coloured circles next to their username which indicate how long they waited to push the button.\n\nThose who don\u2019t push the button receive a grey circle, while those who give in to temptation receive circles ranging from purple all the way down to red.\n\nTo date, no one has waited past the time restrictions of yellow meaning there are no orange or red circles floating around Reddit.\n\nHowever, one can only assume interest will eventually disappear and the true purpose of the button will be revealed.\n\nThink about this: for the past 12 days someone on Earth has pressed a button every 30 sec or so. http://t.co/7ALLYeWB7i #TheButton @reddit \u2014 Zach's Mind (@ZachsMind) April 13, 2015\n\nThe fact that I've been watching #thebutton for 11 days is starting to concern me. \u2014 ConvertToChris (@converttochris) April 12, 2015\n\nI only date people who have not pressed #TheButton \u2014 Pyro (@Pyrao) April 11, 2015",
"target": 0
}
]
```
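Records like the ones above can be mapped back to readable labels for inspection. The sketch below assumes a `0 -> "non-sexual"`, `1 -> "sexual"` mapping inferred from the sample instances and the project name; it is not an official label map.

```python
# Map integer targets from the AutoTrain export to readable labels.
# The mapping below is an assumption inferred from the sample
# instances, not an official label map.
LABELS = {0: "non-sexual", 1: "sexual"}

samples = [
    {"text": "You're No Good: was covered in ...", "target": 1},
    {"text": "AT first \u201cthe button\u201d seemed like ...", "target": 0},
]

def label_counts(instances):
    """Count instances per readable label."""
    counts = {name: 0 for name in LABELS.values()}
    for inst in instances:
        counts[LABELS[inst["target"]]] += 1
    return counts

print(label_counts(samples))  # {'non-sexual': 1, 'sexual': 1}
```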
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"text": "Value(dtype='string', id=None)",
"target": "ClassLabel(names=['0', '1'], id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 568324 |
| valid | 142082 |
|
open-llm-leaderboard/details_sethuiyer__CodeCalc-Mistral-7B | ---
pretty_name: Evaluation run of sethuiyer/CodeCalc-Mistral-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [sethuiyer/CodeCalc-Mistral-7B](https://huggingface.co/sethuiyer/CodeCalc-Mistral-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_sethuiyer__CodeCalc-Mistral-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-19T14:26:23.871957](https://huggingface.co/datasets/open-llm-leaderboard/details_sethuiyer__CodeCalc-Mistral-7B/blob/main/results_2024-02-19T14-26-23.871957.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6299053671856149,\n\
\ \"acc_stderr\": 0.03240187831591768,\n \"acc_norm\": 0.6312070302631679,\n\
\ \"acc_norm_stderr\": 0.033056619058711406,\n \"mc1\": 0.33047735618115054,\n\
\ \"mc1_stderr\": 0.016466769613698303,\n \"mc2\": 0.4778512417068049,\n\
\ \"mc2_stderr\": 0.015210259737289735\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5810580204778157,\n \"acc_stderr\": 0.014418106953639013,\n\
\ \"acc_norm\": 0.6194539249146758,\n \"acc_norm_stderr\": 0.014188277712349812\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6395140410276837,\n\
\ \"acc_stderr\": 0.004791601975612765,\n \"acc_norm\": 0.836387173869747,\n\
\ \"acc_norm_stderr\": 0.0036916784957679717\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.042849586397534015,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.042849586397534015\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n\
\ \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\
\ \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.6011560693641619,\n\
\ \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.03252909619613197,\n\
\ \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.03252909619613197\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36243386243386244,\n \"acc_stderr\": 0.02475747390275206,\n \"\
acc_norm\": 0.36243386243386244,\n \"acc_norm_stderr\": 0.02475747390275206\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7451612903225806,\n \"acc_stderr\": 0.024790118459332208,\n \"\
acc_norm\": 0.7451612903225806,\n \"acc_norm_stderr\": 0.024790118459332208\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n \"\
acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603491,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603491\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217483,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217483\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.02338193534812143,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.02338193534812143\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.024243783994062157,\n\
\ \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.024243783994062157\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.03068473711513535,\n \
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.03068473711513535\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8201834862385321,\n \"acc_stderr\": 0.01646534546739152,\n \"\
acc_norm\": 0.8201834862385321,\n \"acc_norm_stderr\": 0.01646534546739152\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.02955429260569507,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.02955429260569507\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.759493670886076,\n \"acc_stderr\": 0.027820781981149685,\n \
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.027820781981149685\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.039578354719809805,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.039578354719809805\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8071519795657727,\n\
\ \"acc_stderr\": 0.014108533515757433,\n \"acc_norm\": 0.8071519795657727,\n\
\ \"acc_norm_stderr\": 0.014108533515757433\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.0246853168672578,\n\
\ \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.0246853168672578\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3743016759776536,\n\
\ \"acc_stderr\": 0.01618544417945717,\n \"acc_norm\": 0.3743016759776536,\n\
\ \"acc_norm_stderr\": 0.01618544417945717\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.026256053835718964,\n\
\ \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.026256053835718964\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n\
\ \"acc_stderr\": 0.026311858071854155,\n \"acc_norm\": 0.6881028938906752,\n\
\ \"acc_norm_stderr\": 0.026311858071854155\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4602346805736636,\n\
\ \"acc_stderr\": 0.012729785386598566,\n \"acc_norm\": 0.4602346805736636,\n\
\ \"acc_norm_stderr\": 0.012729785386598566\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031215,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031215\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6601307189542484,\n \"acc_stderr\": 0.019162418588623557,\n \
\ \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.019162418588623557\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304335,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304335\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33047735618115054,\n\
\ \"mc1_stderr\": 0.016466769613698303,\n \"mc2\": 0.4778512417068049,\n\
\ \"mc2_stderr\": 0.015210259737289735\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7829518547750592,\n \"acc_stderr\": 0.01158587171020941\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6353297952994693,\n \
\ \"acc_stderr\": 0.013258428375662245\n }\n}\n```"
repo_url: https://huggingface.co/sethuiyer/CodeCalc-Mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|arc:challenge|25_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|gsm8k|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hellaswag|10_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T14-26-23.871957.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-19T14-26-23.871957.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- '**/details_harness|winogrande|5_2024-02-19T14-26-23.871957.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-19T14-26-23.871957.parquet'
- config_name: results
data_files:
- split: 2024_02_19T14_26_23.871957
path:
- results_2024-02-19T14-26-23.871957.parquet
- split: latest
path:
- results_2024-02-19T14-26-23.871957.parquet
---
# Dataset Card for Evaluation run of sethuiyer/CodeCalc-Mistral-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [sethuiyer/CodeCalc-Mistral-7B](https://huggingface.co/sethuiyer/CodeCalc-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_sethuiyer__CodeCalc-Mistral-7B",
"harness_winogrande_5",
split="train")
```
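The per-task metrics in the results file below can also be post-processed locally. As a minimal sketch (using a hand-copied subset of the illustrative values shown below, not the full results file), this averages `acc_norm` across tasks:

```python
# Minimal sketch: average "acc_norm" over a hand-copied subset of the
# per-task results shown below (illustrative values only).
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.6194539249146758},
    "harness|hellaswag|10": {"acc_norm": 0.836387173869747},
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.3},
}

# Mean acc_norm across the selected tasks.
mean_acc_norm = sum(v["acc_norm"] for v in results.values()) / len(results)
print(f"mean acc_norm over {len(results)} tasks: {mean_acc_norm:.4f}")
```

The same pattern applies to any subset of the JSON below, e.g. to recompute an MMLU-only average from the `harness|hendrycksTest-*` entries.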
## Latest results
These are the [latest results from run 2024-02-19T14:26:23.871957](https://huggingface.co/datasets/open-llm-leaderboard/details_sethuiyer__CodeCalc-Mistral-7B/blob/main/results_2024-02-19T14-26-23.871957.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6299053671856149,
"acc_stderr": 0.03240187831591768,
"acc_norm": 0.6312070302631679,
"acc_norm_stderr": 0.033056619058711406,
"mc1": 0.33047735618115054,
"mc1_stderr": 0.016466769613698303,
"mc2": 0.4778512417068049,
"mc2_stderr": 0.015210259737289735
},
"harness|arc:challenge|25": {
"acc": 0.5810580204778157,
"acc_stderr": 0.014418106953639013,
"acc_norm": 0.6194539249146758,
"acc_norm_stderr": 0.014188277712349812
},
"harness|hellaswag|10": {
"acc": 0.6395140410276837,
"acc_stderr": 0.004791601975612765,
"acc_norm": 0.836387173869747,
"acc_norm_stderr": 0.0036916784957679717
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.042849586397534015,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.042849586397534015
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.02890159361241178,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.02890159361241178
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.03252909619613197,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.03252909619613197
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36243386243386244,
"acc_stderr": 0.02475747390275206,
"acc_norm": 0.36243386243386244,
"acc_norm_stderr": 0.02475747390275206
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7451612903225806,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.7451612903225806,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.03287666758603491,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.03287666758603491
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217483,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217483
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.02338193534812143,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.02338193534812143
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.024243783994062157,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.024243783994062157
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.03068473711513535,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.03068473711513535
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8201834862385321,
"acc_stderr": 0.01646534546739152,
"acc_norm": 0.8201834862385321,
"acc_norm_stderr": 0.01646534546739152
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.02955429260569507,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.02955429260569507
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.027820781981149685,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.027820781981149685
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.039578354719809805,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.039578354719809805
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757433,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757433
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.0246853168672578,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.0246853168672578
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3743016759776536,
"acc_stderr": 0.01618544417945717,
"acc_norm": 0.3743016759776536,
"acc_norm_stderr": 0.01618544417945717
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.026256053835718964,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.026256053835718964
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.026311858071854155,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.026311858071854155
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4602346805736636,
"acc_stderr": 0.012729785386598566,
"acc_norm": 0.4602346805736636,
"acc_norm_stderr": 0.012729785386598566
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031215,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031215
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.019162418588623557,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.019162418588623557
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304335,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304335
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.33047735618115054,
"mc1_stderr": 0.016466769613698303,
"mc2": 0.4778512417068049,
"mc2_stderr": 0.015210259737289735
},
"harness|winogrande|5": {
"acc": 0.7829518547750592,
"acc_stderr": 0.01158587171020941
},
"harness|gsm8k|5": {
"acc": 0.6353297952994693,
"acc_stderr": 0.013258428375662245
}
}
```
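The per-task numbers above can be aggregated with a small helper. The snippet below is a sketch over a tiny illustrative subset of the results dict (not the full file), averaging `acc_norm` where available and falling back to `acc`:

```python
# Sketch: average per-task metrics from a results dict shaped like the JSON above.
# The dict here is a small illustrative subset, not the complete results file.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.6194539249146758},
    "harness|hellaswag|10": {"acc_norm": 0.836387173869747},
    "harness|winogrande|5": {"acc": 0.7829518547750592},
}

def mean_metric(results, metric_priority=("acc_norm", "acc")):
    """Average the first available metric per task."""
    values = []
    for task, metrics in results.items():
        for name in metric_priority:
            if name in metrics:
                values.append(metrics[name])
                break
    return sum(values) / len(values) if values else float("nan")

average = mean_metric(results)
```

This mirrors how the "all" block is a simple aggregate over the individual harness tasks, though the leaderboard's exact weighting may differ.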
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/senkawa_chihiro_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of senkawa_chihiro/千川ちひろ/센카와치히로 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of senkawa_chihiro/千川ちひろ/센카와치히로 (THE iDOLM@STER: Cinderella Girls), containing 291 images and their tags.
The core tags of this character are `brown_hair, braid, long_hair, single_braid, hair_over_shoulder, breasts, scrunchie, brown_eyes, hair_scrunchie, hair_ornament, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 291 | 278.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/senkawa_chihiro_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 291 | 182.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/senkawa_chihiro_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 641 | 363.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/senkawa_chihiro_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 291 | 252.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/senkawa_chihiro_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 641 | 479.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/senkawa_chihiro_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/senkawa_chihiro_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, long_sleeves, solo, black_skirt, blush, collared_shirt, green_jacket, looking_at_viewer, pencil_skirt, red_scrunchie, white_shirt, yellow_necktie, black_pantyhose, office_lady, simple_background, white_background, :d, miniskirt, open_mouth, closed_mouth, dress_shirt, holding, name_tag |
| 1 | 9 |  |  |  |  |  | 1girl, necktie, smile, solo, blush, open_mouth, looking_at_viewer |
| 2 | 14 |  |  |  |  |  | 1girl, blush, navel, solo, smile, green_bikini, looking_at_viewer, open_mouth, large_breasts, cleavage, frilled_bikini, side-tie_bikini_bottom |
| 3 | 6 |  |  |  |  |  | 1girl, blush, cleavage, medium_breasts, solo, looking_at_viewer, green_bra, navel, smile, black_thighhighs, green_panties, open_shirt, sitting |
| 4 | 5 |  |  |  |  |  | 1girl, black_pantyhose, black_skirt, bralines, office_lady, pencil_skirt, solo, white_shirt, ass, bra_visible_through_clothes, from_behind, long_sleeves, closed_eyes, high-waist_skirt, indoors, pantylines, see-through, facing_away, handbag, lanyard, sleeping |
| 5 | 8 |  |  |  |  |  | 1girl, detached_collar, playboy_bunny, rabbit_ears, medium_breasts, solo, wrist_cuffs, blush, bowtie, cleavage, fishnet_pantyhose, looking_at_viewer, black_pantyhose, rabbit_tail, strapless_leotard, bare_shoulders, black_leotard, fake_animal_ears, open_mouth, white_background, simple_background, smile |
| 6 | 5 |  |  |  |  |  | bare_shoulders, collarbone, green_dress, strapless_dress, 1girl, bare_arms, blush, looking_at_viewer, pearl_necklace, solo, bow, cleavage, hands_up, medium_breasts, open_mouth, orange_eyes, red_scrunchie, white_background, :d, bead_necklace, closed_mouth, cowboy_shot, gradient_background, hair_between_eyes, large_breasts, own_hands_together, purple_rose, sash, sparkle, standing, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | long_sleeves | solo | black_skirt | blush | collared_shirt | green_jacket | looking_at_viewer | pencil_skirt | red_scrunchie | white_shirt | yellow_necktie | black_pantyhose | office_lady | simple_background | white_background | :d | miniskirt | open_mouth | closed_mouth | dress_shirt | holding | name_tag | necktie | smile | navel | green_bikini | large_breasts | cleavage | frilled_bikini | side-tie_bikini_bottom | medium_breasts | green_bra | black_thighhighs | green_panties | open_shirt | sitting | bralines | ass | bra_visible_through_clothes | from_behind | closed_eyes | high-waist_skirt | indoors | pantylines | see-through | facing_away | handbag | lanyard | sleeping | detached_collar | playboy_bunny | rabbit_ears | wrist_cuffs | bowtie | fishnet_pantyhose | rabbit_tail | strapless_leotard | bare_shoulders | black_leotard | fake_animal_ears | collarbone | green_dress | strapless_dress | bare_arms | pearl_necklace | bow | hands_up | orange_eyes | bead_necklace | cowboy_shot | gradient_background | hair_between_eyes | own_hands_together | purple_rose | sash | sparkle | standing | upper_body |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-------|:--------------|:--------|:-----------------|:---------------|:--------------------|:---------------|:----------------|:--------------|:-----------------|:------------------|:--------------|:--------------------|:-------------------|:-----|:------------|:-------------|:---------------|:--------------|:----------|:-----------|:----------|:--------|:--------|:---------------|:----------------|:-----------|:-----------------|:-------------------------|:-----------------|:------------|:-------------------|:----------------|:-------------|:----------|:-----------|:------|:------------------------------|:--------------|:--------------|:-------------------|:----------|:-------------|:--------------|:--------------|:----------|:----------|:-----------|:------------------|:----------------|:--------------|:--------------|:---------|:--------------------|:--------------|:--------------------|:-----------------|:----------------|:-------------------|:-------------|:--------------|:------------------|:------------|:-----------------|:------|:-----------|:--------------|:----------------|:--------------|:----------------------|:--------------------|:---------------------|:--------------|:-------|:----------|:-----------|:-------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | | X | | X | | | X | | | | | | | | | | | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 14 |  |  |  |  |  | X | | X | | X | | | X | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | | X | | X | | | X | | | | | | | | | | | | | | | | | X | X | | | X | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | X | X | | | | | X | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | | X | | X | | | X | | | | | X | | X | X | | | X | | | | | | X | | | | X | | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | X | | X | | | X | | X | | | | | | X | X | | X | X | | | | | | | | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
sam-mosaic/evol_chat | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 146959707.60431653
num_examples: 69756
- name: test
num_bytes: 632402.3357142857
num_examples: 300
download_size: 71104381
dataset_size: 147592109.9400308
---
# Dataset Card for "evol_chat"
ChatML-formatted version of [Evol Instruct](https://huggingface.co/datasets/victor123/evol_instruct_70k) |
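For reference, ChatML wraps each turn in `<|im_start|>`/`<|im_end|>` markers. The sketch below is an assumption about the general template, not the exact script used to build this dataset:

```python
# Minimal ChatML formatter (illustrative; the dataset's exact template may differ).
def to_chatml(prompt: str, response: str) -> str:
    """Wrap a single prompt/response pair in ChatML turn markers."""
    return (
        "<|im_start|>user\n" + prompt + "<|im_end|>\n"
        "<|im_start|>assistant\n" + response + "<|im_end|>\n"
    )
```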
smangrul/ultrachat-feedback-10k-chatml | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: score_chosen
dtype: float64
- name: score_rejected
dtype: float64
splits:
- name: train
num_bytes: 65996149
num_examples: 10000
- name: test
num_bytes: 13161585
num_examples: 2000
download_size: 44057628
dataset_size: 79157734
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
xlajitx/Mylora | ---
license: unknown
---
|
zolak/twitter_dataset_79_1713226705 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 131719
num_examples: 334
download_size: 75059
dataset_size: 131719
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yuvalkirstain/beautiful_interesting_spectacular_photo_dog_25000 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: width
dtype: int64
- name: height
dtype: int64
- name: pclean
dtype: float64
splits:
- name: train
num_bytes: 361773346.0
num_examples: 504
download_size: 361776700
dataset_size: 361773346.0
---
# Dataset Card for "beautiful_interesting_spectacular_photo_dog_25000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-one-sec-cv12/chunk_224 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 905866800
num_examples: 177900
download_size: 924915778
dataset_size: 905866800
---
# Dataset Card for "chunk_224"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pimentooliver/fungi | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 41590978.0
num_examples: 841
download_size: 40501239
dataset_size: 41590978.0
---
# Dataset Card for "fungi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |