| datasetId | card |
|---|---|
thanhduycao/vivos_ng_only | ---
dataset_info:
features:
- name: speaker_id
dtype: string
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 8858003.732418524
num_examples: 60
- name: test
num_bytes: 453063.74736842106
num_examples: 4
download_size: 717197
dataset_size: 9311067.479786946
---
# Dataset Card for "vivos_ng_only"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
haydenbanz/Toxic_Plant_Classification | ---
license: mit
---
|
octoz/domingo | ---
license: openrail
---
|
St4n/new_dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: file_name
dtype: string
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 25020
num_examples: 100
download_size: 0
dataset_size: 25020
---
# Dataset Card for "new_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1712956909 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 11314
num_examples: 25
download_size: 9824
dataset_size: 11314
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1712956909"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Azure99/blossom-chat-v3 | ---
license: apache-2.0
task_categories:
- text-generation
- text2text-generation
language:
- zh
- en
size_categories:
- 10K<n<100K
---
# BLOSSOM CHAT V3
### Introduction
Blossom Chat V3 is a bilingual Chinese-English conversation dataset derived from ShareGPT 90K, suitable for multi-turn dialogue fine-tuning.
Compared with blossom-chat-v2, this version is distilled entirely with GPT-4.
The dataset extracts multi-turn dialogue instructions from ShareGPT, translates only the instructions, and then iteratively calls gpt-4-0125-preview on those multi-turn instructions.
Compared with the original ShareGPT data, it mainly addresses the scarcity of Chinese dialogue data and the output truncation caused by ChatGPT's generation length limit.
This release covers 50% of the full data, comprising 5K records.
### Languages
Primarily Chinese and English, mixed at a ratio of roughly 1:1.
### Dataset Structure
Each record represents a complete multi-turn conversation and contains two fields: id and conversations.
- id: increments from 1.
- conversations: an array of objects, each containing a role and a content field; role is either user or assistant, denoting user input and assistant output respectively, and content holds the corresponding text.
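The record layout described above can be sketched as follows. This is a minimal illustration with a made-up record (the example strings are hypothetical; only the field names and role values come from the card):

```python
# Hypothetical record following the id/conversations schema described above.
record = {
    "id": 1,
    "conversations": [
        {"role": "user", "content": "Hello, who are you?"},
        {"role": "assistant", "content": "I am an AI assistant."},
    ],
}

def iter_turns(record):
    """Yield (role, content) pairs in conversation order."""
    for message in record["conversations"]:
        yield message["role"], message["content"]

# Roles alternate between user input and assistant output.
roles = [role for role, _ in iter_turns(record)]
print(roles)  # ['user', 'assistant']
```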
### Dataset Limitations
Because only the inputs of the original multi-turn dialogues were extracted, conversations that involve randomness (for example, guessing a random number) may be incoherent across turns.
In addition, all responses in this dataset were generated by gpt-4-0125-preview and have not been rigorously validated; they may contain inaccurate or even seriously wrong answers. |
Thouph/D3C | ---
license: apache-2.0
---
|
RIW/small_coco_test_100 | ---
dataset_info:
features:
- name: image
dtype: image
- name: caption
dtype: string
- name: url
dtype: string
- name: key
dtype: string
- name: status
dtype: string
- name: error_message
dtype: 'null'
- name: width
dtype: int64
- name: height
dtype: int64
- name: original_width
dtype: int64
- name: original_height
dtype: int64
- name: exif
dtype: string
- name: sha256
dtype: string
splits:
- name: train
num_bytes: 885002400.915
num_examples: 8965
- name: validation
num_bytes: 885002400.915
num_examples: 8965
download_size: 370625886
dataset_size: 1770004801.83
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
open-llm-leaderboard/details_cognitivecomputations__dolphin-2.2.1-mistral-7b | ---
pretty_name: Evaluation run of cognitivecomputations/dolphin-2.2.1-mistral-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cognitivecomputations/dolphin-2.2.1-mistral-7b](https://huggingface.co/cognitivecomputations/dolphin-2.2.1-mistral-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of\
  \ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
  \ be found as a specific split in each configuration, the split being named using\
  \ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cognitivecomputations__dolphin-2.2.1-mistral-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
  These are the [latest results from run 2024-01-04T12:00:28.671767](https://huggingface.co/datasets/open-llm-leaderboard/details_cognitivecomputations__dolphin-2.2.1-mistral-7b/blob/main/results_2024-01-04T12-00-28.671767.json)\
  \ (note that there might be results for other tasks in the repos if successive\
  \ evals didn't cover the same tasks. You can find each in the results and the\
  \ \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6314773728091061,\n\
\ \"acc_stderr\": 0.032247020008011884,\n \"acc_norm\": 0.6351477685860653,\n\
\ \"acc_norm_stderr\": 0.03288962606766026,\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.016850961061720113,\n \"mc2\": 0.5314305414143765,\n\
\ \"mc2_stderr\": 0.015039173098592665\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.60580204778157,\n \"acc_stderr\": 0.014280522667467323,\n\
\ \"acc_norm\": 0.6322525597269625,\n \"acc_norm_stderr\": 0.01409099561816848\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6432981477793268,\n\
\ \"acc_stderr\": 0.004780467270911771,\n \"acc_norm\": 0.8379804819757021,\n\
\ \"acc_norm_stderr\": 0.00367715668784884\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926604,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926604\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880263,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880263\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340354,\n\
\ \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340354\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520196,\n \"\
acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520196\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377561,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377561\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7774193548387097,\n \"acc_stderr\": 0.02366421667164251,\n \"\
acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.02366421667164251\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175007,\n \"\
acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175007\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494562,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494562\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709443,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.024639789097709443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6410256410256411,\n \"acc_stderr\": 0.024321738484602354,\n\
\ \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.024321738484602354\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606649,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606649\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.037579499229433426,\n \"\
acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.037579499229433426\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8366972477064221,\n \"acc_stderr\": 0.01584825580650155,\n \"\
acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.01584825580650155\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.034076320938540516,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.034076320938540516\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967407,\n \"\
acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967407\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098823,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098823\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.03826076324884866,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.03826076324884866\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077805,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077805\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n\
\ \"acc_stderr\": 0.014000791294406999,\n \"acc_norm\": 0.8109833971902938,\n\
\ \"acc_norm_stderr\": 0.014000791294406999\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577612,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577612\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3865921787709497,\n\
\ \"acc_stderr\": 0.016286674879101022,\n \"acc_norm\": 0.3865921787709497,\n\
\ \"acc_norm_stderr\": 0.016286674879101022\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.025494259350694912,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.025494259350694912\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.025407197798890162,\n\
\ \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.025407197798890162\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4517601043024772,\n\
\ \"acc_stderr\": 0.012710662233660247,\n \"acc_norm\": 0.4517601043024772,\n\
\ \"acc_norm_stderr\": 0.012710662233660247\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.029029422815681393,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.029029422815681393\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6535947712418301,\n \"acc_stderr\": 0.01924978569171721,\n \
\ \"acc_norm\": 0.6535947712418301,\n \"acc_norm_stderr\": 0.01924978569171721\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.02853556033712844,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.02853556033712844\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.016850961061720113,\n \"mc2\": 0.5314305414143765,\n\
\ \"mc2_stderr\": 0.015039173098592665\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7861089187056038,\n \"acc_stderr\": 0.011524466954090257\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.48142532221379836,\n \
\ \"acc_stderr\": 0.013762977910317584\n }\n}\n```"
repo_url: https://huggingface.co/cognitivecomputations/dolphin-2.2.1-mistral-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|arc:challenge|25_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|gsm8k|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hellaswag|10_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T12-00-28.671767.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-04T12-00-28.671767.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- '**/details_harness|winogrande|5_2024-01-04T12-00-28.671767.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-04T12-00-28.671767.parquet'
- config_name: results
data_files:
- split: 2024_01_04T12_00_28.671767
path:
- results_2024-01-04T12-00-28.671767.parquet
- split: latest
path:
- results_2024-01-04T12-00-28.671767.parquet
---
# Dataset Card for Evaluation run of cognitivecomputations/dolphin-2.2.1-mistral-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cognitivecomputations/dolphin-2.2.1-mistral-7b](https://huggingface.co/cognitivecomputations/dolphin-2.2.1-mistral-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cognitivecomputations__dolphin-2.2.1-mistral-7b",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-04T12:00:28.671767](https://huggingface.co/datasets/open-llm-leaderboard/details_cognitivecomputations__dolphin-2.2.1-mistral-7b/blob/main/results_2024-01-04T12-00-28.671767.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6314773728091061,
"acc_stderr": 0.032247020008011884,
"acc_norm": 0.6351477685860653,
"acc_norm_stderr": 0.03288962606766026,
"mc1": 0.36474908200734396,
"mc1_stderr": 0.016850961061720113,
"mc2": 0.5314305414143765,
"mc2_stderr": 0.015039173098592665
},
"harness|arc:challenge|25": {
"acc": 0.60580204778157,
"acc_stderr": 0.014280522667467323,
"acc_norm": 0.6322525597269625,
"acc_norm_stderr": 0.01409099561816848
},
"harness|hellaswag|10": {
"acc": 0.6432981477793268,
"acc_stderr": 0.004780467270911771,
"acc_norm": 0.8379804819757021,
"acc_norm_stderr": 0.00367715668784884
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926604,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880263,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880263
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105653,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105653
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340354,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340354
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520196,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520196
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377561,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377561
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.02366421667164251,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.02366421667164251
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494562,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494562
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.024639789097709443,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.024639789097709443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6410256410256411,
"acc_stderr": 0.024321738484602354,
"acc_norm": 0.6410256410256411,
"acc_norm_stderr": 0.024321738484602354
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606649,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606649
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.037579499229433426,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.037579499229433426
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.01584825580650155,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.01584825580650155
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967407,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967407
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098823,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098823
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03826076324884866,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03826076324884866
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077805,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077805
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294406999,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294406999
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577612,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577612
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3865921787709497,
"acc_stderr": 0.016286674879101022,
"acc_norm": 0.3865921787709497,
"acc_norm_stderr": 0.016286674879101022
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.025494259350694912,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.025494259350694912
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.025407197798890162,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.025407197798890162
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4517601043024772,
"acc_stderr": 0.012710662233660247,
"acc_norm": 0.4517601043024772,
"acc_norm_stderr": 0.012710662233660247
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.029029422815681393,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.029029422815681393
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6535947712418301,
"acc_stderr": 0.01924978569171721,
"acc_norm": 0.6535947712418301,
"acc_norm_stderr": 0.01924978569171721
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.02853556033712844,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.02853556033712844
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36474908200734396,
"mc1_stderr": 0.016850961061720113,
"mc2": 0.5314305414143765,
"mc2_stderr": 0.015039173098592665
},
"harness|winogrande|5": {
"acc": 0.7861089187056038,
"acc_stderr": 0.011524466954090257
},
"harness|gsm8k|5": {
"acc": 0.48142532221379836,
"acc_stderr": 0.013762977910317584
}
}
```
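As a hedged sketch (the key names below follow the JSON excerpt above), the per-task scores can also be aggregated locally, e.g. to average accuracy over the MMLU (`hendrycksTest`) subtasks:

```python
import json

# Tiny excerpt in the same shape as the results JSON above;
# the full file contains one entry per harness task.
results_json = """
{
  "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.27},
  "harness|hendrycksTest-anatomy|5": {"acc": 0.6074074074074074},
  "harness|winogrande|5": {"acc": 0.7861089187056038}
}
"""

results = json.loads(results_json)

# Mean accuracy over the MMLU (hendrycksTest) subtasks only
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest")]
mmlu_mean = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_mean, 4))  # 0.4387 for this two-task excerpt
```

The same filter-and-average pattern works for any task family, since every key follows the `harness|<task>|<n_shots>` naming scheme.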
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Yhyu13/glaive-function-calling-v2-llama-factory-convert | ---
license: apache-2.0
---
This is a converted version of https://huggingface.co/datasets/glaiveai/glaive-function-calling-v2 that enables SFT in https://github.com/hiyouga/LLaMA-Factory for function-calling fine-tuning.
You need to add the following to the datasets.json file, and change the `file_name` to your local path.
```
"glaive-function-calling-v2": {
"file_name": "./glaive-function-calling-v2/simple-function-calling-v2_converted.json",
"columns": {
"prompt": "instruction",
"query": "input",
"response": "output",
"history": "history"
}
}
```
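As a quick sanity check before training, each converted record can be validated against the column mapping above. This is a minimal sketch: the field names follow the mapping, but the example record below is purely illustrative and not taken from the dataset.

```python
import json

# Columns that the mapping above tells LLaMA-Factory to read.
EXPECTED_COLUMNS = ("instruction", "input", "output", "history")

def validate_sample(sample: dict) -> bool:
    """Check that one converted record carries every column the mapping refers to."""
    return all(col in sample for col in EXPECTED_COLUMNS)

def load_converted(path: str) -> list[dict]:
    """Load a converted JSON file and keep only well-formed records."""
    with open(path, encoding="utf-8") as f:
        data = json.load(f)
    return [s for s in data if validate_sample(s)]

# Illustrative record shaped like the converted schema (values are made up).
example = {
    "instruction": "SYSTEM: You are a helpful assistant with access to functions.",
    "input": "What is the weather like in Paris?",
    "output": "<functioncall> ...",
    "history": [],
}
assert validate_sample(example)
```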
There is also a `simple-function-calling-v2_converted.json` that is trimmed to the first 1,000 samples of the original dataset, which is about 1% of its size. |
liuyanchen1015/MULTI_VALUE_mnli_our_we | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 55263
num_examples: 206
- name: dev_mismatched
num_bytes: 118172
num_examples: 489
- name: test_matched
num_bytes: 49344
num_examples: 189
- name: test_mismatched
num_bytes: 115265
num_examples: 503
- name: train
num_bytes: 2270590
num_examples: 8598
download_size: 1472058
dataset_size: 2608634
---
# Dataset Card for "MULTI_VALUE_mnli_our_we"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
RustamovPY/books_dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: voice
dtype: string
- name: text
dtype: string
- name: speaker
dtype: string
splits:
- name: train
num_bytes: 168
num_examples: 3
download_size: 1810
dataset_size: 168
---
# Dataset Card for "books_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Dahoas/fno-flowers | ---
dataset_info:
features:
- name: images
sequence:
sequence:
sequence: uint8
splits:
- name: train
num_bytes: 26771456
num_examples: 2048
download_size: 25312381
dataset_size: 26771456
---
# Dataset Card for "fno-flowers"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jxu9001/tagged_addresses_v4 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: tokens
sequence: string
- name: tags
sequence: int64
splits:
- name: train
num_bytes: 10944348
num_examples: 105594
- name: validation
num_bytes: 1363481
num_examples: 13199
- name: test
num_bytes: 1370992
num_examples: 13200
download_size: 3795824
dataset_size: 13678821
---
# Dataset Card for "tagged_addresses_v4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TurkuNLP/Suomi24-toxicity-annotated | ---
license: cc-by-sa-4.0
task_categories:
- text-classification
language:
- fi
tags:
- toxicity
size_categories:
- 1K<n<10K
---
### Suomi-24-toxicity-annotated
This dataset includes comments from Suomi24 sampled using predictions from a toxicity classifier. The comments were taken in intervals for each label. The process of sampling emphasized difficult borderline cases. 500 comments were sampled for each label.
The annotation process used the labels from Perspective, used e.g. for `TurkuNLP/wikipedia-toxicity-data-fi`.
Instead of multi-label annotation, we annotated each comment for only one label, although a couple of comments appear under two labels.
The annotation process consisted of an initial annotation of 100-200 comments, followed by a discussion and final annotations. Raw data can be found [here](https://github.com/TurkuNLP/toxicity-classifier/tree/main/annotations/raw_annotations).
Examples that made it to the dataset are ones that had unanimous agreement or were resolved through discussion.
### Citing
To cite this dataset use the following bibtex.
```
@inproceedings{eskelinen-etal-2023-toxicity,
title = "Toxicity Detection in {F}innish Using Machine Translation",
author = "Eskelinen, Anni and
Silvala, Laura and
Ginter, Filip and
Pyysalo, Sampo and
Laippala, Veronika",
booktitle = "Proceedings of the 24th Nordic Conference on Computational Linguistics (NoDaLiDa)",
month = may,
year = "2023",
address = "T{\'o}rshavn, Faroe Islands",
publisher = "University of Tartu Library",
url = "https://aclanthology.org/2023.nodalida-1.68",
pages = "685--697",
abstract = "Due to the popularity of social media platforms and the sheer amount of user-generated content online, the automatic detection of toxic language has become crucial in the creation of a friendly and safe digital space. Previous work has been mostly focusing on English leaving many lower-resource languages behind. In this paper, we present novel resources for toxicity detection in Finnish by introducing two new datasets, a machine translated toxicity dataset for Finnish based on the widely used English Jigsaw dataset and a smaller test set of Suomi24 discussion forum comments originally written in Finnish and manually annotated following the definitions of the labels that were used to annotate the Jigsaw dataset. We show that machine translating the training data to Finnish provides better toxicity detection results than using the original English training data and zero-shot cross-lingual transfer with XLM-R, even with our newly annotated dataset from Suomi24.",
}
```
## Label definitions taken from Perspective API
THREAT: Describes an intention to inflict pain, injury, or violence against an individual or group.
THREATENING: Language that is threatening or encouraging violence or harm, including self-harm.
PROFANITY: Swear words, curse words, or other obscene or profane language.
INSULT: Insulting, inflammatory, or negative comment towards a person or a group of people. Such comments are not necessarily identity specific.
IDENTITY ATTACK: Negative or hateful comments targeting someone because of their identity.
TOXICITY: A rude, disrespectful, or unreasonable comment that is likely to make people leave a discussion.
SEVERE TOXICITY: A very hateful, aggressive, disrespectful comment or otherwise very likely to make a user leave a discussion or give up on sharing their perspective. This attribute is much less sensitive to more mild forms of toxicity, such as comments that include positive uses of curse words.
## Guidelines used for annotation:
### Obscene
swearwords, including mild expletives and misspelled, masked, or other variations
sexually explicit words/terminology that are not topically or contextually appropriate
### Threat
suicidal or self-harm comments, incitement to violence or self-harm, hypothetical situations and wishing harm to somebody
comments that are very unlikely to happen if not marked clearly as sarcasm
only threats towards people are annotated as threat
threats made by somebody else other than the writer NOT included
counterfactual statements NOT included <!--- as in "if I was there I would have..." --->
### Insult
terms that are insulting towards groups of people (also in identity attack)
insults against political groups, e.g. "vitun demari/suvakki/persu" -> "fucking liberal/conservative etc." <!--- I made this decision here.. --->
negative insulting comments towards oneself, things other than people and hypothetical situations NOT included
<!--- PROBLEM: use of racist or rapist if true, target not clear --->
### Identity attack
comments that have no negative language but are still clearly negative
negative statements towards political groups or groups that nobody self-identifies with are NOT included (unless an insult)
### Toxicity
unreasonably expressed negative comments regardless of the target present and whether the target is known or not
mild or humoristic swearwords are NOT included
positive or neutral sexually explicit comments are NOT included
### Severe toxicity
comments that include only sexually explicit content
only one severely toxic element is needed to have this label and a comment is severely toxic even if the comment contains substantive content
target does not need to be present nor does the target matter
## Inter-annotator agreement:
| Label | Initial (unanimous) | After discussion (unanimous) | Initial (at least 2/3) | After discussion (at least 2/3) |
|------ | ------------------- | ---------------------------- | ---------------------- | ------------------------------- |
| identity attack | 54,5 % | 66,6 % | 92 % | 93,6 % |
| insult | 47,5 % | 49,6 % | 94,5 % | 95,6 % |
| severe toxicity | 63 % | 66 % | 92 % | 96,6 % |
| threat | 82 % | 80,3 % | 98 % | 97,3 % |
| toxicity | 58 % | 54 % | 93 % | 89,6 % |
| obscene | 69 % | 62 % | 97 % | 96 % |
## Evaluation results
Evaluation results from using `TurkuNLP/bert-large-finnish-cased-toxicity`.
| Label | Precision | Recall | F1 |
|------ | ------------------- | ---------------------------- | ---------------------- |
| identity attack | 73,2 | 32 | 44,6 |
| insult | 59,4 | 46,8 | 52,4 |
| severe toxicity | 12 | 28,6 | 16,9 |
| threat | 32,4 | 28,6 | 30,4 |
| toxicity | 60,4 | 79,2 | 68,5 |
| obscene | 64,5 | 82,4 | 72,3 |
| OVERALL | 57,4 | 58,9 | 51,1 |
| OVERALL weighted by original sample counts | 55,5 | 65,5 | 60,1 |
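Since F1 is the harmonic mean of precision and recall, the per-label rows above can be sanity-checked. A small sketch (the table writes percentages with decimal commas; plain floats are used here):

```python
def f1(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall, both given in percent."""
    return 2 * precision * recall / (precision + recall)

# (precision, recall, reported F1) for a few rows of the evaluation table.
rows = {
    "identity attack": (73.2, 32.0, 44.6),
    "severe toxicity": (12.0, 28.6, 16.9),
    "threat": (32.4, 28.6, 30.4),
    "toxicity": (60.4, 79.2, 68.5),
    "obscene": (64.5, 82.4, 72.3),
}
for label, (p, r, reported) in rows.items():
    # Recomputed F1 should match the table to within rounding.
    assert abs(f1(p, r) - reported) < 0.1, label
```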
## Licensing Information
Contents of this repository are distributed under the
[Creative Commons Attribution-ShareAlike 4.0 International License (CC BY-SA 4.0)](https://creativecommons.org/licenses/by-sa/4.0/).
Copyright of the dataset contents belongs to the original copyright holders. |
gustamatos/test_genome | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 29555981
num_examples: 30
- name: test
num_bytes: 6004459
num_examples: 3
download_size: 16043250
dataset_size: 35560440
---
# Dataset Card for "test_genome"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dim/dolly_prompt_ru | ---
dataset_info:
features:
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 23359298
num_examples: 15950
download_size: 0
dataset_size: 23359298
---
# Dataset Card for "dolly_prompt_ru"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jeremygf/domains-alpha | ---
tags:
- web
- domain names
- text
size_categories:
- 100M<n<1B
---
A list of all registered .com domain names composed only of ASCII characters 97 to 122 (lowercase letters a-z), as of January 2024. |
Obscure-Entropy/Alzheimer-MRI | ---
license: mit
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Mild_Demented
'1': Moderate_Demented
'2': Non_Demented
'3': Very_Mild_Demented
splits:
- name: train
num_bytes: 18038518.080000002
num_examples: 4096
- name: valid
num_bytes: 4509856.847999999
num_examples: 1024
- name: test
num_bytes: 5641603.64
num_examples: 1280
- name: aug
num_bytes: 120366196.0
num_examples: 4096
download_size: 120349645
dataset_size: 148556174.56800002
---
|
saitsharipov/ddpm-butterflies-128 | ---
license: unknown
---
|
acloudfan/wikismall | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 2545110.078371501
num_examples: 8842
- name: validation
num_bytes: 282949.9216284987
num_examples: 983
download_size: 1538446
dataset_size: 2828060.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
mohammadhia/diffusers_animate_character | ---
dataset_info:
features:
- name: input_image
dtype: image
- name: edit_prompt
dtype: string
- name: edited_image
dtype: image
splits:
- name: train
num_bytes: 29092003.0
num_examples: 20
download_size: 29095136
dataset_size: 29092003.0
---
# Dataset Card for "diffusers_animate_character"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_YKM12__Mistral-7B-summ-privatev1 | ---
pretty_name: Evaluation run of YKM12/Mistral-7B-summ-privatev1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [YKM12/Mistral-7B-summ-privatev1](https://huggingface.co/YKM12/Mistral-7B-summ-privatev1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_YKM12__Mistral-7B-summ-privatev1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-01T16:51:47.124175](https://huggingface.co/datasets/open-llm-leaderboard/details_YKM12__Mistral-7B-summ-privatev1/blob/main/results_2024-02-01T16-51-47.124175.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6561198532911652,\n\
\ \"acc_stderr\": 0.03198692258775897,\n \"acc_norm\": 0.65546725426879,\n\
\ \"acc_norm_stderr\": 0.03265800632201962,\n \"mc1\": 0.5826193390452876,\n\
\ \"mc1_stderr\": 0.017262891063272164,\n \"mc2\": 0.7188545613118653,\n\
\ \"mc2_stderr\": 0.01474731238488106\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7090443686006825,\n \"acc_stderr\": 0.013273077865907588,\n\
\ \"acc_norm\": 0.7414675767918089,\n \"acc_norm_stderr\": 0.012794553754288694\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.716391157140012,\n\
\ \"acc_stderr\": 0.004498280244494495,\n \"acc_norm\": 0.8884684325831508,\n\
\ \"acc_norm_stderr\": 0.0031414591751392712\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.674074074074074,\n\
\ \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.674074074074074,\n\
\ \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.028049186315695255,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.028049186315695255\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5158730158730159,\n\
\ \"acc_stderr\": 0.044698818540726076,\n \"acc_norm\": 0.5158730158730159,\n\
\ \"acc_norm_stderr\": 0.044698818540726076\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8,\n\
\ \"acc_stderr\": 0.022755204959542946,\n \"acc_norm\": 0.8,\n \
\ \"acc_norm_stderr\": 0.022755204959542946\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8181818181818182,\n \"acc_stderr\": 0.0274796030105388,\n \"acc_norm\"\
: 0.8181818181818182,\n \"acc_norm_stderr\": 0.0274796030105388\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971125,\n\
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971125\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621126,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621126\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993466,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993466\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4424581005586592,\n\
\ \"acc_stderr\": 0.016611393687268584,\n \"acc_norm\": 0.4424581005586592,\n\
\ \"acc_norm_stderr\": 0.016611393687268584\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729484,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729484\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042107,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042107\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47196870925684486,\n\
\ \"acc_stderr\": 0.012750151802922435,\n \"acc_norm\": 0.47196870925684486,\n\
\ \"acc_norm_stderr\": 0.012750151802922435\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396553,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396553\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000328,\n \
\ \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000328\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616914,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5826193390452876,\n\
\ \"mc1_stderr\": 0.017262891063272164,\n \"mc2\": 0.7188545613118653,\n\
\ \"mc2_stderr\": 0.01474731238488106\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8531965272296764,\n \"acc_stderr\": 0.009946627440250676\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7020470053070508,\n \
\ \"acc_stderr\": 0.01259793223291453\n }\n}\n```"
repo_url: https://huggingface.co/YKM12/Mistral-7B-summ-privatev1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|arc:challenge|25_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|gsm8k|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hellaswag|10_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T16-51-47.124175.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T16-51-47.124175.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- '**/details_harness|winogrande|5_2024-02-01T16-51-47.124175.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-01T16-51-47.124175.parquet'
- config_name: results
data_files:
- split: 2024_02_01T16_51_47.124175
path:
- results_2024-02-01T16-51-47.124175.parquet
- split: latest
path:
- results_2024-02-01T16-51-47.124175.parquet
---
# Dataset Card for Evaluation run of YKM12/Mistral-7B-summ-privatev1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [YKM12/Mistral-7B-summ-privatev1](https://huggingface.co/YKM12/Mistral-7B-summ-privatev1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_YKM12__Mistral-7B-summ-privatev1",
"harness_winogrande_5",
split="train")
```
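Since each run's split is named with its timestamp (e.g. `2024_02_01T16_51_47.124175`), the most recent run can be recovered by sorting the split names directly, because the zero-padded `YYYY_MM_DD` format sorts chronologically as plain strings. A minimal sketch (the first split name below is illustrative, only the second is a real run from this dataset):

```python
def latest_run_split(split_names):
    """Return the most recent timestamped split name.

    Timestamps of the form YYYY_MM_DDTHH_MM_SS.ffffff are zero-padded,
    so lexicographic order matches chronological order.
    """
    timestamped = [s for s in split_names if s != "latest"]
    return max(timestamped)

splits = ["2024_01_15T10_00_00.000000", "2024_02_01T16_51_47.124175", "latest"]
print(latest_run_split(splits))  # -> 2024_02_01T16_51_47.124175
```

In practice the `latest` split already points at the newest run, so this is only needed when comparing multiple timestamped runs.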
## Latest results
These are the [latest results from run 2024-02-01T16:51:47.124175](https://huggingface.co/datasets/open-llm-leaderboard/details_YKM12__Mistral-7B-summ-privatev1/blob/main/results_2024-02-01T16-51-47.124175.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each task in its own results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6561198532911652,
"acc_stderr": 0.03198692258775897,
"acc_norm": 0.65546725426879,
"acc_norm_stderr": 0.03265800632201962,
"mc1": 0.5826193390452876,
"mc1_stderr": 0.017262891063272164,
"mc2": 0.7188545613118653,
"mc2_stderr": 0.01474731238488106
},
"harness|arc:challenge|25": {
"acc": 0.7090443686006825,
"acc_stderr": 0.013273077865907588,
"acc_norm": 0.7414675767918089,
"acc_norm_stderr": 0.012794553754288694
},
"harness|hellaswag|10": {
"acc": 0.716391157140012,
"acc_stderr": 0.004498280244494495,
"acc_norm": 0.8884684325831508,
"acc_norm_stderr": 0.0031414591751392712
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.674074074074074,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.674074074074074,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.028049186315695255,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.028049186315695255
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5158730158730159,
"acc_stderr": 0.044698818540726076,
"acc_norm": 0.5158730158730159,
"acc_norm_stderr": 0.044698818540726076
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8,
"acc_stderr": 0.022755204959542946,
"acc_norm": 0.8,
"acc_norm_stderr": 0.022755204959542946
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.0274796030105388,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.0274796030105388
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971125,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971125
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621126,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621126
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993466,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993466
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258176,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4424581005586592,
"acc_stderr": 0.016611393687268584,
"acc_norm": 0.4424581005586592,
"acc_norm_stderr": 0.016611393687268584
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.025261691219729484,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.025261691219729484
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042107,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042107
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47196870925684486,
"acc_stderr": 0.012750151802922435,
"acc_norm": 0.47196870925684486,
"acc_norm_stderr": 0.012750151802922435
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396553,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396553
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000328,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000328
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616914,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5826193390452876,
"mc1_stderr": 0.017262891063272164,
"mc2": 0.7188545613118653,
"mc2_stderr": 0.01474731238488106
},
"harness|winogrande|5": {
"acc": 0.8531965272296764,
"acc_stderr": 0.009946627440250676
},
"harness|gsm8k|5": {
"acc": 0.7020470053070508,
"acc_stderr": 0.01259793223291453
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Mohammed-Altaf/medical-instruction-120k | ---
license: mit
language:
- en
tags:
- medical
pretty_name: python
size_categories:
- 100K<n<1M
---
# What is the Dataset About?🤷🏼♂️
---
The dataset is useful for training a generative language model for medical applications and instruction purposes. It consists of various thoughts proposed by people [**mentioned as the Human**] and their responses, which include medical terminology, not limited to but including drug names, prescriptions, yogic exercise suggestions, breathing exercise suggestions, and a few natural home-made remedies.
# How was the Dataset Made?😅
---
I have combined all the available open-source datasets into a single data source for training, which is completely open-sourced and somewhat reliable.
* There is a smaller version of this dataset here 👉🏼 [Link](https://huggingface.co/datasets/Mohammed-Altaf/medical-instruction-100k)
## Example Training Scripts:
* Qlora Fine Tuning - |
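The QLoRA fine-tuning entry above has no script attached; a minimal, illustrative configuration sketch is shown below. It assumes the `transformers`, `peft`, and `bitsandbytes` libraries, and every value (rank, alpha, target modules, dtype) is a placeholder assumption, not the card author's actual setup.

```python
# Illustrative QLoRA configuration only -- all values below are assumptions,
# not taken from this dataset card.
import torch
from transformers import BitsAndBytesConfig
from peft import LoraConfig

# 4-bit NF4 quantization of the frozen base model (the "Q" in QLoRA)
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

# Low-rank adapters trained on top of the quantized weights
lora_config = LoraConfig(
    r=16,                      # adapter rank (assumed)
    lora_alpha=32,             # scaling factor (assumed)
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
```

From there, the dataset could be loaded with `datasets.load_dataset`, the quantized base model instantiated with `quantization_config=bnb_config`, wrapped via `peft.get_peft_model(model, lora_config)`, and trained with a standard `Trainer` loop.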
arieg/bw_spec_cls_4_08_noise_200 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '694'
'1': '695'
'2': '714'
'3': '715'
splits:
- name: train
num_bytes: 44590621.0
num_examples: 800
- name: test
num_bytes: 1109208.0
num_examples: 20
download_size: 22473093
dataset_size: 45699829.0
---
# Dataset Card for "bw_spec_cls_4_08_noise_200"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Kaina99/FNAF9 | ---
license: openrail
---
|
NPCProgrammer/BERT_tweet_tuned | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': non_irony
'1': irony
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 9085595
num_examples: 2862
- name: test
num_bytes: 2493753
num_examples: 784
- name: validation
num_bytes: 3031237
num_examples: 955
download_size: 573289
dataset_size: 14610585
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
carnival13/massive_5_lang_DA_tokenized | ---
dataset_info:
features:
- name: pass_label
dtype: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 424287645
num_examples: 552890
download_size: 127805722
dataset_size: 424287645
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "massive_5_lang_DA_tokenized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Eunju2834/img_captioning_oilcanvas_style | ---
task_categories:
- text-generation
tags:
- art
size_categories:
- 1K<n<10K
--- |
open-llm-leaderboard/details_heegyu__WizardVicuna2-13b-hf | ---
pretty_name: Evaluation run of heegyu/WizardVicuna2-13b-hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [heegyu/WizardVicuna2-13b-hf](https://huggingface.co/heegyu/WizardVicuna2-13b-hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_heegyu__WizardVicuna2-13b-hf\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-23T20:35:02.988920](https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__WizardVicuna2-13b-hf/blob/main/results_2023-10-23T20-35-02.988920.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.17806208053691275,\n\
\ \"em_stderr\": 0.003917823631096753,\n \"f1\": 0.23031459731543547,\n\
\ \"f1_stderr\": 0.003944169111986955,\n \"acc\": 0.4045526704895304,\n\
\ \"acc_stderr\": 0.009815196819519213\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.17806208053691275,\n \"em_stderr\": 0.003917823631096753,\n\
\ \"f1\": 0.23031459731543547,\n \"f1_stderr\": 0.003944169111986955\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07429871114480667,\n \
\ \"acc_stderr\": 0.007223844172845566\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7348066298342542,\n \"acc_stderr\": 0.01240654946619286\n\
\ }\n}\n```"
repo_url: https://huggingface.co/heegyu/WizardVicuna2-13b-hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|arc:challenge|25_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_23T20_35_02.988920
path:
- '**/details_harness|drop|3_2023-10-23T20-35-02.988920.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-23T20-35-02.988920.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_23T20_35_02.988920
path:
- '**/details_harness|gsm8k|5_2023-10-23T20-35-02.988920.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-23T20-35-02.988920.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hellaswag|10_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T15:23:39.656390.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T15:23:39.656390.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-09T15:23:39.656390.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_23T20_35_02.988920
path:
- '**/details_harness|winogrande|5_2023-10-23T20-35-02.988920.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-23T20-35-02.988920.parquet'
- config_name: results
data_files:
- split: 2023_08_09T15_23_39.656390
path:
- results_2023-08-09T15:23:39.656390.parquet
- split: 2023_10_23T20_35_02.988920
path:
- results_2023-10-23T20-35-02.988920.parquet
- split: latest
path:
- results_2023-10-23T20-35-02.988920.parquet
---
# Dataset Card for Evaluation run of heegyu/WizardVicuna2-13b-hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/heegyu/WizardVicuna2-13b-hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [heegyu/WizardVicuna2-13b-hf](https://huggingface.co/heegyu/WizardVicuna2-13b-hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_heegyu__WizardVicuna2-13b-hf",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-23T20:35:02.988920](https://huggingface.co/datasets/open-llm-leaderboard/details_heegyu__WizardVicuna2-13b-hf/blob/main/results_2023-10-23T20-35-02.988920.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.17806208053691275,
"em_stderr": 0.003917823631096753,
"f1": 0.23031459731543547,
"f1_stderr": 0.003944169111986955,
"acc": 0.4045526704895304,
"acc_stderr": 0.009815196819519213
},
"harness|drop|3": {
"em": 0.17806208053691275,
"em_stderr": 0.003917823631096753,
"f1": 0.23031459731543547,
"f1_stderr": 0.003944169111986955
},
"harness|gsm8k|5": {
"acc": 0.07429871114480667,
"acc_stderr": 0.007223844172845566
},
"harness|winogrande|5": {
"acc": 0.7348066298342542,
"acc_stderr": 0.01240654946619286
}
}
```
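The per-task entries above are keyed as `harness|<task>|<num_fewshot>`. As a minimal sketch of working with such a results dict, the literal below copies two entries from the metrics shown (it is illustrative only, not part of the card template):

```python
# Two entries mirrored from the latest results above; keys follow the
# "harness|<task>|<num_fewshot>" naming used by the evaluation harness.
results = {
    "harness|gsm8k|5": {"acc": 0.07429871114480667, "acc_stderr": 0.007223844172845566},
    "harness|winogrande|5": {"acc": 0.7348066298342542, "acc_stderr": 0.01240654946619286},
}

# Map each task name to its accuracy by splitting the composite key.
accuracies = {task.split("|")[1]: metrics["acc"] for task, metrics in results.items()}
print(accuracies["winogrande"])  # ~0.735
```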
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Codec-SUPERB/gtzan_music_speech_synth | ---
configs:
- config_name: default
data_files:
- split: original
path: data/original-*
- split: academicodec_hifi_16k_320d
path: data/academicodec_hifi_16k_320d-*
- split: academicodec_hifi_16k_320d_large_uni
path: data/academicodec_hifi_16k_320d_large_uni-*
- split: academicodec_hifi_24k_320d
path: data/academicodec_hifi_24k_320d-*
- split: audiodec_24k_320d
path: data/audiodec_24k_320d-*
- split: dac_16k
path: data/dac_16k-*
- split: dac_24k
path: data/dac_24k-*
- split: dac_44k
path: data/dac_44k-*
- split: encodec_24k_12bps
path: data/encodec_24k_12bps-*
- split: encodec_24k_1_5bps
path: data/encodec_24k_1_5bps-*
- split: encodec_24k_24bps
path: data/encodec_24k_24bps-*
- split: encodec_24k_3bps
path: data/encodec_24k_3bps-*
- split: encodec_24k_6bps
path: data/encodec_24k_6bps-*
- split: funcodec_en_libritts_16k_gr1nq32ds320
path: data/funcodec_en_libritts_16k_gr1nq32ds320-*
- split: funcodec_en_libritts_16k_gr8nq32ds320
path: data/funcodec_en_libritts_16k_gr8nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds320
path: data/funcodec_en_libritts_16k_nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds640
path: data/funcodec_en_libritts_16k_nq32ds640-*
- split: funcodec_zh_en_16k_nq32ds320
path: data/funcodec_zh_en_16k_nq32ds320-*
- split: funcodec_zh_en_16k_nq32ds640
path: data/funcodec_zh_en_16k_nq32ds640-*
- split: speech_tokenizer_16k
path: data/speech_tokenizer_16k-*
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 48000
- name: id
dtype: string
splits:
- name: original
num_bytes: 368654056.0
num_examples: 128
- name: academicodec_hifi_16k_320d
num_bytes: 122894168.0
num_examples: 128
- name: academicodec_hifi_16k_320d_large_uni
num_bytes: 122894168.0
num_examples: 128
- name: academicodec_hifi_24k_320d
num_bytes: 184334168.0
num_examples: 128
- name: audiodec_24k_320d
num_bytes: 184334184.0
num_examples: 128
- name: dac_16k
num_bytes: 122894168.0
num_examples: 128
- name: dac_24k
num_bytes: 184334168.0
num_examples: 128
- name: dac_44k
num_bytes: 338702168.0
num_examples: 128
- name: encodec_24k_12bps
num_bytes: 184334168.0
num_examples: 128
- name: encodec_24k_1_5bps
num_bytes: 184334168.0
num_examples: 128
- name: encodec_24k_24bps
num_bytes: 184334168.0
num_examples: 128
- name: encodec_24k_3bps
num_bytes: 184334168.0
num_examples: 128
- name: encodec_24k_6bps
num_bytes: 184334168.0
num_examples: 128
- name: funcodec_en_libritts_16k_gr1nq32ds320
num_bytes: 122894168.0
num_examples: 128
- name: funcodec_en_libritts_16k_gr8nq32ds320
num_bytes: 122894168.0
num_examples: 128
- name: funcodec_en_libritts_16k_nq32ds320
num_bytes: 122894168.0
num_examples: 128
- name: funcodec_en_libritts_16k_nq32ds640
num_bytes: 122894168.0
num_examples: 128
- name: funcodec_zh_en_16k_nq32ds320
num_bytes: 122894168.0
num_examples: 128
- name: funcodec_zh_en_16k_nq32ds640
num_bytes: 122894168.0
num_examples: 128
- name: speech_tokenizer_16k
num_bytes: 122894168.0
num_examples: 128
download_size: 3409069084
dataset_size: 3410971264.0
---
# Dataset Card for "gtzan_music_speech_synth"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_rishiraj__oswald-7b | ---
pretty_name: Evaluation run of rishiraj/oswald-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [rishiraj/oswald-7b](https://huggingface.co/rishiraj/oswald-7b) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_rishiraj__oswald-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-11T08:51:34.161186](https://huggingface.co/datasets/open-llm-leaderboard/details_rishiraj__oswald-7b/blob/main/results_2024-01-11T08-51-34.161186.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6563844878215765,\n\
\ \"acc_stderr\": 0.03172096744574799,\n \"acc_norm\": 0.6569381429523545,\n\
\ \"acc_norm_stderr\": 0.03236913498893308,\n \"mc1\": 0.3733170134638923,\n\
\ \"mc1_stderr\": 0.01693237055757063,\n \"mc2\": 0.5407006602948565,\n\
\ \"mc2_stderr\": 0.015292352537910794\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6356655290102389,\n \"acc_stderr\": 0.014063260279882419,\n\
\ \"acc_norm\": 0.6638225255972696,\n \"acc_norm_stderr\": 0.013804855026205761\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6581358295160327,\n\
\ \"acc_stderr\": 0.004733649274814508,\n \"acc_norm\": 0.851822346146186,\n\
\ \"acc_norm_stderr\": 0.003545499169558053\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4417989417989418,\n \"acc_stderr\": 0.025576257061253837,\n \"\
acc_norm\": 0.4417989417989418,\n \"acc_norm_stderr\": 0.025576257061253837\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8032258064516129,\n\
\ \"acc_stderr\": 0.022616409420742025,\n \"acc_norm\": 0.8032258064516129,\n\
\ \"acc_norm_stderr\": 0.022616409420742025\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.0315841532404771,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.0315841532404771\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267042,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267042\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9222797927461139,\n \"acc_stderr\": 0.019321805557223144,\n\
\ \"acc_norm\": 0.9222797927461139,\n \"acc_norm_stderr\": 0.019321805557223144\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6871794871794872,\n \"acc_stderr\": 0.023507579020645358,\n\
\ \"acc_norm\": 0.6871794871794872,\n \"acc_norm_stderr\": 0.023507579020645358\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.02995382389188704,\n \
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.02995382389188704\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8587155963302753,\n \"acc_stderr\": 0.014933868987028075,\n \"\
acc_norm\": 0.8587155963302753,\n \"acc_norm_stderr\": 0.014933868987028075\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579654,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579654\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.030636591348699813,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.030636591348699813\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.035477710041594654,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.035477710041594654\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n\
\ \"acc_stderr\": 0.01987565502786744,\n \"acc_norm\": 0.8974358974358975,\n\
\ \"acc_norm_stderr\": 0.01987565502786744\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8339719029374202,\n\
\ \"acc_stderr\": 0.013306478243066302,\n \"acc_norm\": 0.8339719029374202,\n\
\ \"acc_norm_stderr\": 0.013306478243066302\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7543352601156069,\n \"acc_stderr\": 0.02317629820399201,\n\
\ \"acc_norm\": 0.7543352601156069,\n \"acc_norm_stderr\": 0.02317629820399201\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.264804469273743,\n\
\ \"acc_stderr\": 0.01475690648326066,\n \"acc_norm\": 0.264804469273743,\n\
\ \"acc_norm_stderr\": 0.01475690648326066\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.023805186524888135,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.023805186524888135\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713002,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713002\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.4869621903520209,\n \"acc_stderr\": 0.012765893883835332,\n\
\ \"acc_norm\": 0.4869621903520209,\n \"acc_norm_stderr\": 0.012765893883835332\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.7279411764705882,\n \"acc_stderr\": 0.027033041151681456,\n \"\
acc_norm\": 0.7279411764705882,\n \"acc_norm_stderr\": 0.027033041151681456\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6862745098039216,\n \"acc_stderr\": 0.018771683893528183,\n \
\ \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.018771683893528183\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960238,\n\
\ \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960238\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3733170134638923,\n\
\ \"mc1_stderr\": 0.01693237055757063,\n \"mc2\": 0.5407006602948565,\n\
\ \"mc2_stderr\": 0.015292352537910794\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8089976322020521,\n \"acc_stderr\": 0.011047808761510429\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6929492039423806,\n \
\ \"acc_stderr\": 0.012705685723131707\n }\n}\n```"
repo_url: https://huggingface.co/rishiraj/oswald-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|arc:challenge|25_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|gsm8k|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hellaswag|10_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-11T08-51-34.161186.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-11T08-51-34.161186.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- '**/details_harness|winogrande|5_2024-01-11T08-51-34.161186.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-11T08-51-34.161186.parquet'
- config_name: results
data_files:
- split: 2024_01_11T08_51_34.161186
path:
- results_2024-01-11T08-51-34.161186.parquet
- split: latest
path:
- results_2024-01-11T08-51-34.161186.parquet
---
# Dataset Card for Evaluation run of rishiraj/oswald-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [rishiraj/oswald-7b](https://huggingface.co/rishiraj/oswald-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_rishiraj__oswald-7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-11T08:51:34.161186](https://huggingface.co/datasets/open-llm-leaderboard/details_rishiraj__oswald-7b/blob/main/results_2024-01-11T08-51-34.161186.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6563844878215765,
"acc_stderr": 0.03172096744574799,
"acc_norm": 0.6569381429523545,
"acc_norm_stderr": 0.03236913498893308,
"mc1": 0.3733170134638923,
"mc1_stderr": 0.01693237055757063,
"mc2": 0.5407006602948565,
"mc2_stderr": 0.015292352537910794
},
"harness|arc:challenge|25": {
"acc": 0.6356655290102389,
"acc_stderr": 0.014063260279882419,
"acc_norm": 0.6638225255972696,
"acc_norm_stderr": 0.013804855026205761
},
"harness|hellaswag|10": {
"acc": 0.6581358295160327,
"acc_stderr": 0.004733649274814508,
"acc_norm": 0.851822346146186,
"acc_norm_stderr": 0.003545499169558053
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4417989417989418,
"acc_stderr": 0.025576257061253837,
"acc_norm": 0.4417989417989418,
"acc_norm_stderr": 0.025576257061253837
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8032258064516129,
"acc_stderr": 0.022616409420742025,
"acc_norm": 0.8032258064516129,
"acc_norm_stderr": 0.022616409420742025
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.0315841532404771,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.0315841532404771
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267042,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267042
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9222797927461139,
"acc_stderr": 0.019321805557223144,
"acc_norm": 0.9222797927461139,
"acc_norm_stderr": 0.019321805557223144
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6871794871794872,
"acc_stderr": 0.023507579020645358,
"acc_norm": 0.6871794871794872,
"acc_norm_stderr": 0.023507579020645358
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.02995382389188704,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.02995382389188704
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8587155963302753,
"acc_stderr": 0.014933868987028075,
"acc_norm": 0.8587155963302753,
"acc_norm_stderr": 0.014933868987028075
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579654,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579654
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699813,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699813
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.035477710041594654,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.035477710041594654
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8974358974358975,
"acc_stderr": 0.01987565502786744,
"acc_norm": 0.8974358974358975,
"acc_norm_stderr": 0.01987565502786744
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8339719029374202,
"acc_stderr": 0.013306478243066302,
"acc_norm": 0.8339719029374202,
"acc_norm_stderr": 0.013306478243066302
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7543352601156069,
"acc_stderr": 0.02317629820399201,
"acc_norm": 0.7543352601156069,
"acc_norm_stderr": 0.02317629820399201
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.264804469273743,
"acc_stderr": 0.01475690648326066,
"acc_norm": 0.264804469273743,
"acc_norm_stderr": 0.01475690648326066
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.023805186524888135,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.023805186524888135
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600713002,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600713002
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4869621903520209,
"acc_stderr": 0.012765893883835332,
"acc_norm": 0.4869621903520209,
"acc_norm_stderr": 0.012765893883835332
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7279411764705882,
"acc_stderr": 0.027033041151681456,
"acc_norm": 0.7279411764705882,
"acc_norm_stderr": 0.027033041151681456
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.018771683893528183,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.018771683893528183
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.027682979522960238,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.027682979522960238
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3733170134638923,
"mc1_stderr": 0.01693237055757063,
"mc2": 0.5407006602948565,
"mc2_stderr": 0.015292352537910794
},
"harness|winogrande|5": {
"acc": 0.8089976322020521,
"acc_stderr": 0.011047808761510429
},
"harness|gsm8k|5": {
"acc": 0.6929492039423806,
"acc_stderr": 0.012705685723131707
}
}
```
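For readers unfamiliar with the reporting convention, each `acc_stderr` above can be combined with its `acc` to form a rough 95% confidence interval (a standard normal approximation sketched here for illustration, not something the leaderboard itself publishes):

```python
def confidence_interval(acc: float, stderr: float, z: float = 1.96):
    """Approximate 95% CI around an accuracy, given its standard error."""
    return (acc - z * stderr, acc + z * stderr)

# Winogrande numbers from the results above
low, high = confidence_interval(0.8089976322020521, 0.011047808761510429)
print(f"winogrande acc: 0.809 (95% CI {low:.3f}-{high:.3f})")
```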
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mask-distilled-one-sec-cv12/chunk_8 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1118931356
num_examples: 219743
download_size: 1138834704
dataset_size: 1118931356
---
# Dataset Card for "chunk_8"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zhangshuoming/math_23k_double_value_init | ---
dataset_info:
features:
- name: text
struct:
- name: asm
dtype: string
- name: c
dtype: string
- name: driver
dtype: string
splits:
- name: train
num_bytes: 22718470
num_examples: 21104
download_size: 0
dataset_size: 22718470
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "math_23k_double_value_init"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EinfachOlder/autotrain-data-gtzs-bj3r-fz0k | ---
dataset_info:
features:
- name: content
dtype: string
- name: autotrain_text
dtype: string
- name: autotrain_label
dtype: string
splits:
- name: train
num_bytes: 12258
num_examples: 6
- name: validation
num_bytes: 3200
num_examples: 2
download_size: 33851
dataset_size: 15458
---
# Dataset Card for "autotrain-data-gtzs-bj3r-fz0k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
umuthopeyildirim/svgen-500k | ---
license: cc
task_categories:
- text-generation
language:
- en
tags:
- SVG
- vector
pretty_name: SVGen Dataset
size_categories:
- 100K<n<1M
---
# SVGen Vector Images Dataset
## Overview
SVGen is a comprehensive dataset of 300,000 SVG files drawn from a diverse set of sources, including SVG-Repo, Noto Emoji, and InstructSVG. It aims to provide a wide range of SVGs suitable for applications in web development, design, and machine learning research.
## Data Fields
- **input**: The name or label of the SVG item
- **output**: SVG code containing the vector representation
- **description**: Brief description of the SVG item
- **source**: The original source or collection of the SVG
- **license**: Licensing terms for using the SVG
## Data Sources
- [SVG-Repo](https://www.svgrepo.com/)
- [Noto Emoji](https://huggingface.co/datasets/darknoon/noto-emoji-vector-512-svg)
- [InstructSVG](https://huggingface.co/datasets/uwunion/instruct_svg)
## Usage
The dataset is particularly useful for tasks such as icon classification, style transfer, image-to-vector translation, and much more. It serves as a rich resource for machine learning models that require high-quality SVG data.
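As a sketch of how the fields might be consumed downstream (the records below are hypothetical stand-ins mirroring the schema listed under Data Fields, not actual dataset rows):

```python
# Hypothetical records mirroring the dataset schema (input, output,
# description, source, license); real rows come from `load_dataset`.
records = [
    {"input": "home icon", "output": "<svg>...</svg>",
     "description": "A house outline", "source": "SVG-Repo", "license": "CC0"},
    {"input": "star", "output": "<svg>...</svg>",
     "description": "A five-pointed star", "source": "Noto Emoji", "license": "Apache-2.0"},
]

def filter_by_license(rows, allowed):
    """Keep only rows whose `license` is in the allowed set, per the card's
    advice to consult each record's license before use."""
    return [r for r in rows if r["license"] in allowed]

cc0_rows = filter_by_license(records, {"CC0"})
```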
## Help Wanted
I wanted to use BLIP to generate a `description` for each SVG, but it's not working well. If you have any ideas, please let me know. Here is the [GitHub](https://github.com/umuthopeyildirim/SVGenDataset), which also contains Colab notebook links.
## License
The dataset incorporates SVG files with varying licenses. Users are advised to consult the `license` field of each record for specific usage rights.
## Contribution Guidelines
Contributions are welcome! If you find any issues or would like to add more SVGs to the dataset, please submit a pull request or open an issue in the repository.
## Acknowledgements
A huge thanks to SVGRepo, Noto Emoji, and InstructSVG for providing the SVG files that make up this dataset.
For more details and to download the dataset, visit the project repository.
|
open-llm-leaderboard/details_ab24g21__llama-2-new | ---
pretty_name: Evaluation run of ab24g21/llama-2-new
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ab24g21/llama-2-new](https://huggingface.co/ab24g21/llama-2-new) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ab24g21__llama-2-new\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-29T19:27:59.516823](https://huggingface.co/datasets/open-llm-leaderboard/details_ab24g21__llama-2-new/blob/main/results_2024-03-29T19-27-59.516823.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5466189870605842,\n\
\ \"acc_stderr\": 0.033732797227792224,\n \"acc_norm\": 0.5511803803357496,\n\
\ \"acc_norm_stderr\": 0.03444062705895664,\n \"mc1\": 0.28518971848225216,\n\
\ \"mc1_stderr\": 0.015805827874454892,\n \"mc2\": 0.44584650680019916,\n\
\ \"mc2_stderr\": 0.01533217759011234\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5477815699658704,\n \"acc_stderr\": 0.01454451988063383,\n\
\ \"acc_norm\": 0.5870307167235495,\n \"acc_norm_stderr\": 0.014388344935398326\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6206930890260904,\n\
\ \"acc_stderr\": 0.004842229276915339,\n \"acc_norm\": 0.8153754232224656,\n\
\ \"acc_norm_stderr\": 0.0038719976167342694\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5328947368421053,\n \"acc_stderr\": 0.040601270352363966,\n\
\ \"acc_norm\": 0.5328947368421053,\n \"acc_norm_stderr\": 0.040601270352363966\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5811320754716981,\n \"acc_stderr\": 0.030365050829115208,\n\
\ \"acc_norm\": 0.5811320754716981,\n \"acc_norm_stderr\": 0.030365050829115208\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n\
\ \"acc_stderr\": 0.04112490974670788,\n \"acc_norm\": 0.5902777777777778,\n\
\ \"acc_norm_stderr\": 0.04112490974670788\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.47398843930635837,\n\
\ \"acc_stderr\": 0.03807301726504511,\n \"acc_norm\": 0.47398843930635837,\n\
\ \"acc_norm_stderr\": 0.03807301726504511\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39148936170212767,\n \"acc_stderr\": 0.03190701242326812,\n\
\ \"acc_norm\": 0.39148936170212767,\n \"acc_norm_stderr\": 0.03190701242326812\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3439153439153439,\n \"acc_stderr\": 0.024464426625596437,\n \"\
acc_norm\": 0.3439153439153439,\n \"acc_norm_stderr\": 0.024464426625596437\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6483870967741936,\n\
\ \"acc_stderr\": 0.027162537826948458,\n \"acc_norm\": 0.6483870967741936,\n\
\ \"acc_norm_stderr\": 0.027162537826948458\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162934,\n\
\ \"acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162934\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\"\
: 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.7772020725388601,\n \"acc_stderr\": 0.030031147977641538,\n\
\ \"acc_norm\": 0.7772020725388601,\n \"acc_norm_stderr\": 0.030031147977641538\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4897435897435897,\n \"acc_stderr\": 0.025345672221942374,\n\
\ \"acc_norm\": 0.4897435897435897,\n \"acc_norm_stderr\": 0.025345672221942374\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.02794045713622841,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.02794045713622841\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.5378151260504201,\n \"acc_stderr\": 0.032385469487589795,\n\
\ \"acc_norm\": 0.5378151260504201,\n \"acc_norm_stderr\": 0.032385469487589795\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.744954128440367,\n \"acc_stderr\": 0.018688500856535832,\n \"\
acc_norm\": 0.744954128440367,\n \"acc_norm_stderr\": 0.018688500856535832\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4166666666666667,\n \"acc_stderr\": 0.03362277436608044,\n \"\
acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03362277436608044\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7352941176470589,\n \"acc_stderr\": 0.030964517926923403,\n \"\
acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.030964517926923403\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7426160337552743,\n \"acc_stderr\": 0.028458820991460305,\n \
\ \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.028458820991460305\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.042607351576445594,\n\
\ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.042607351576445594\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.037311335196738925,\n\
\ \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.037311335196738925\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.026246772946890484,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.026246772946890484\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7484035759897829,\n\
\ \"acc_stderr\": 0.015517322365529641,\n \"acc_norm\": 0.7484035759897829,\n\
\ \"acc_norm_stderr\": 0.015517322365529641\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.026483392042098177,\n\
\ \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.026483392042098177\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.29608938547486036,\n\
\ \"acc_stderr\": 0.015268677317602283,\n \"acc_norm\": 0.29608938547486036,\n\
\ \"acc_norm_stderr\": 0.015268677317602283\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.027826109307283693,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.027826109307283693\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6109324758842444,\n\
\ \"acc_stderr\": 0.027690337536485376,\n \"acc_norm\": 0.6109324758842444,\n\
\ \"acc_norm_stderr\": 0.027690337536485376\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.595679012345679,\n \"acc_stderr\": 0.027306625297327695,\n\
\ \"acc_norm\": 0.595679012345679,\n \"acc_norm_stderr\": 0.027306625297327695\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.375886524822695,\n \"acc_stderr\": 0.028893955412115886,\n \
\ \"acc_norm\": 0.375886524822695,\n \"acc_norm_stderr\": 0.028893955412115886\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3878748370273794,\n\
\ \"acc_stderr\": 0.01244499830967562,\n \"acc_norm\": 0.3878748370273794,\n\
\ \"acc_norm_stderr\": 0.01244499830967562\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4963235294117647,\n \"acc_stderr\": 0.030372015885428195,\n\
\ \"acc_norm\": 0.4963235294117647,\n \"acc_norm_stderr\": 0.030372015885428195\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5343137254901961,\n \"acc_stderr\": 0.020180144843307296,\n \
\ \"acc_norm\": 0.5343137254901961,\n \"acc_norm_stderr\": 0.020180144843307296\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6326530612244898,\n \"acc_stderr\": 0.030862144921087558,\n\
\ \"acc_norm\": 0.6326530612244898,\n \"acc_norm_stderr\": 0.030862144921087558\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n\
\ \"acc_stderr\": 0.030965903123573012,\n \"acc_norm\": 0.7412935323383084,\n\
\ \"acc_norm_stderr\": 0.030965903123573012\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03377310252209205,\n\
\ \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03377310252209205\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28518971848225216,\n\
\ \"mc1_stderr\": 0.015805827874454892,\n \"mc2\": 0.44584650680019916,\n\
\ \"mc2_stderr\": 0.01533217759011234\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.760852407261247,\n \"acc_stderr\": 0.011988541844843909\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2971948445792267,\n \
\ \"acc_stderr\": 0.012588685966624187\n }\n}\n```"
repo_url: https://huggingface.co/ab24g21/llama-2-new
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|arc:challenge|25_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|gsm8k|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hellaswag|10_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T19-27-59.516823.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-29T19-27-59.516823.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- '**/details_harness|winogrande|5_2024-03-29T19-27-59.516823.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-29T19-27-59.516823.parquet'
- config_name: results
data_files:
- split: 2024_03_29T19_27_59.516823
path:
- results_2024-03-29T19-27-59.516823.parquet
- split: latest
path:
- results_2024-03-29T19-27-59.516823.parquet
---
# Dataset Card for Evaluation run of ab24g21/llama-2-new
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ab24g21/llama-2-new](https://huggingface.co/ab24g21/llama-2-new) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ab24g21__llama-2-new",
"harness_winogrande_5",
	split="latest")
```
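Each config name in the YAML above is a mechanical transformation of the harness task name that appears in the parquet paths (`|`, `:`, and `-` become `_`). A small helper can derive one from the other; this is an illustrative sketch based on the names listed in this card, not part of the `datasets` API:

```python
def task_to_config(task: str) -> str:
    """Map a harness task name (as seen in the parquet paths) to its config name."""
    # e.g. "harness|hendrycksTest-nutrition|5" -> "harness_hendrycksTest_nutrition_5"
    return task.replace("|", "_").replace(":", "_").replace("-", "_")
```

For example, `task_to_config("harness|truthfulqa:mc|0")` yields `"harness_truthfulqa_mc_0"`, matching the config names in the frontmatter above.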
## Latest results
These are the [latest results from run 2024-03-29T19:27:59.516823](https://huggingface.co/datasets/open-llm-leaderboard/details_ab24g21__llama-2-new/blob/main/results_2024-03-29T19-27-59.516823.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5466189870605842,
"acc_stderr": 0.033732797227792224,
"acc_norm": 0.5511803803357496,
"acc_norm_stderr": 0.03444062705895664,
"mc1": 0.28518971848225216,
"mc1_stderr": 0.015805827874454892,
"mc2": 0.44584650680019916,
"mc2_stderr": 0.01533217759011234
},
"harness|arc:challenge|25": {
"acc": 0.5477815699658704,
"acc_stderr": 0.01454451988063383,
"acc_norm": 0.5870307167235495,
"acc_norm_stderr": 0.014388344935398326
},
"harness|hellaswag|10": {
"acc": 0.6206930890260904,
"acc_stderr": 0.004842229276915339,
"acc_norm": 0.8153754232224656,
"acc_norm_stderr": 0.0038719976167342694
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5328947368421053,
"acc_stderr": 0.040601270352363966,
"acc_norm": 0.5328947368421053,
"acc_norm_stderr": 0.040601270352363966
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5811320754716981,
"acc_stderr": 0.030365050829115208,
"acc_norm": 0.5811320754716981,
"acc_norm_stderr": 0.030365050829115208
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670788,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670788
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.47398843930635837,
"acc_stderr": 0.03807301726504511,
"acc_norm": 0.47398843930635837,
"acc_norm_stderr": 0.03807301726504511
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39148936170212767,
"acc_stderr": 0.03190701242326812,
"acc_norm": 0.39148936170212767,
"acc_norm_stderr": 0.03190701242326812
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3439153439153439,
"acc_stderr": 0.024464426625596437,
"acc_norm": 0.3439153439153439,
"acc_norm_stderr": 0.024464426625596437
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6483870967741936,
"acc_stderr": 0.027162537826948458,
"acc_norm": 0.6483870967741936,
"acc_norm_stderr": 0.027162537826948458
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162934,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162934
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7772020725388601,
"acc_stderr": 0.030031147977641538,
"acc_norm": 0.7772020725388601,
"acc_norm_stderr": 0.030031147977641538
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4897435897435897,
"acc_stderr": 0.025345672221942374,
"acc_norm": 0.4897435897435897,
"acc_norm_stderr": 0.025345672221942374
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.02794045713622841,
"acc_norm": 0.3,
"acc_norm_stderr": 0.02794045713622841
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5378151260504201,
"acc_stderr": 0.032385469487589795,
"acc_norm": 0.5378151260504201,
"acc_norm_stderr": 0.032385469487589795
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.744954128440367,
"acc_stderr": 0.018688500856535832,
"acc_norm": 0.744954128440367,
"acc_norm_stderr": 0.018688500856535832
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03362277436608044,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03362277436608044
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.030964517926923403,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.030964517926923403
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.028458820991460305,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.028458820991460305
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.042607351576445594,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.042607351576445594
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.037311335196738925,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.037311335196738925
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.026246772946890484,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.026246772946890484
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7484035759897829,
"acc_stderr": 0.015517322365529641,
"acc_norm": 0.7484035759897829,
"acc_norm_stderr": 0.015517322365529641
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.026483392042098177,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.026483392042098177
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.29608938547486036,
"acc_stderr": 0.015268677317602283,
"acc_norm": 0.29608938547486036,
"acc_norm_stderr": 0.015268677317602283
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.027826109307283693,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.027826109307283693
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6109324758842444,
"acc_stderr": 0.027690337536485376,
"acc_norm": 0.6109324758842444,
"acc_norm_stderr": 0.027690337536485376
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.595679012345679,
"acc_stderr": 0.027306625297327695,
"acc_norm": 0.595679012345679,
"acc_norm_stderr": 0.027306625297327695
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.375886524822695,
"acc_stderr": 0.028893955412115886,
"acc_norm": 0.375886524822695,
"acc_norm_stderr": 0.028893955412115886
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3878748370273794,
"acc_stderr": 0.01244499830967562,
"acc_norm": 0.3878748370273794,
"acc_norm_stderr": 0.01244499830967562
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4963235294117647,
"acc_stderr": 0.030372015885428195,
"acc_norm": 0.4963235294117647,
"acc_norm_stderr": 0.030372015885428195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5343137254901961,
"acc_stderr": 0.020180144843307296,
"acc_norm": 0.5343137254901961,
"acc_norm_stderr": 0.020180144843307296
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6326530612244898,
"acc_stderr": 0.030862144921087558,
"acc_norm": 0.6326530612244898,
"acc_norm_stderr": 0.030862144921087558
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.030965903123573012,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.030965903123573012
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03377310252209205,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03377310252209205
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28518971848225216,
"mc1_stderr": 0.015805827874454892,
"mc2": 0.44584650680019916,
"mc2_stderr": 0.01533217759011234
},
"harness|winogrande|5": {
"acc": 0.760852407261247,
"acc_stderr": 0.011988541844843909
},
"harness|gsm8k|5": {
"acc": 0.2971948445792267,
"acc_stderr": 0.012588685966624187
}
}
```
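Downstream tooling usually needs only a couple of fields from this JSON. A minimal sketch of ranking tasks by their headline metric, using a hand-copied excerpt of the values above (the `score` helper is a hypothetical convenience, not part of the harness):

```python
# Excerpt of the results JSON above, keyed by harness task name.
results = {
    "harness|arc:challenge|25": {"acc": 0.5477815699658704, "acc_norm": 0.5870307167235495},
    "harness|hellaswag|10": {"acc": 0.6206930890260904, "acc_norm": 0.8153754232224656},
    "harness|winogrande|5": {"acc": 0.760852407261247},
    "harness|gsm8k|5": {"acc": 0.2971948445792267},
}

def score(metrics: dict) -> float:
    # Prefer normalized accuracy when reported, falling back to raw accuracy.
    return metrics.get("acc_norm", metrics["acc"])

# Tasks sorted best-first by their headline metric.
ranked = sorted(results, key=lambda task: score(results[task]), reverse=True)
```

On this excerpt, HellaSwag ranks first and GSM8K last, matching the figures in the JSON above.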
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_ldahee__SLAL-0.1 | ---
pretty_name: Evaluation run of ldahee/SLAL-0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ldahee/SLAL-0.1](https://huggingface.co/ldahee/SLAL-0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ldahee__SLAL-0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"latest\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-01T03:39:34.987881](https://huggingface.co/datasets/open-llm-leaderboard/details_ldahee__SLAL-0.1/blob/main/results_2024-03-01T03-39-34.987881.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6598872301882013,\n\
\ \"acc_stderr\": 0.031979502334376575,\n \"acc_norm\": 0.6609271308158413,\n\
\ \"acc_norm_stderr\": 0.03265360850599212,\n \"mc1\": 0.37576499388004897,\n\
\ \"mc1_stderr\": 0.016954584060214297,\n \"mc2\": 0.5421631897463146,\n\
\ \"mc2_stderr\": 0.015425726907541416\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5426621160409556,\n \"acc_stderr\": 0.014558106543924058,\n\
\ \"acc_norm\": 0.5793515358361775,\n \"acc_norm_stderr\": 0.014426211252508396\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6094403505277833,\n\
\ \"acc_stderr\": 0.004868787333436578,\n \"acc_norm\": 0.8014339772953595,\n\
\ \"acc_norm_stderr\": 0.003981052091169832\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7697368421052632,\n \"acc_stderr\": 0.03426059424403165,\n\
\ \"acc_norm\": 0.7697368421052632,\n \"acc_norm_stderr\": 0.03426059424403165\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724057,\n\
\ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724057\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n\
\ \"acc_stderr\": 0.03216600808802269,\n \"acc_norm\": 0.8194444444444444,\n\
\ \"acc_norm_stderr\": 0.03216600808802269\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.82,\n\
\ \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6510638297872341,\n \"acc_stderr\": 0.031158522131357783,\n\
\ \"acc_norm\": 0.6510638297872341,\n \"acc_norm_stderr\": 0.031158522131357783\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6620689655172414,\n \"acc_stderr\": 0.039417076320648906,\n\
\ \"acc_norm\": 0.6620689655172414,\n \"acc_norm_stderr\": 0.039417076320648906\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5052910052910053,\n \"acc_stderr\": 0.02574986828855657,\n \"\
acc_norm\": 0.5052910052910053,\n \"acc_norm_stderr\": 0.02574986828855657\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8129032258064516,\n\
\ \"acc_stderr\": 0.02218571009225225,\n \"acc_norm\": 0.8129032258064516,\n\
\ \"acc_norm_stderr\": 0.02218571009225225\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.03465304488406796,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.03465304488406796\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\"\
: 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603492,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603492\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8484848484848485,\n \"acc_stderr\": 0.02554565042660362,\n \"\
acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.02554565042660362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121437,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121437\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6794871794871795,\n \"acc_stderr\": 0.02366129639396428,\n \
\ \"acc_norm\": 0.6794871794871795,\n \"acc_norm_stderr\": 0.02366129639396428\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251976,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251976\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.028657491285071966,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.028657491285071966\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4304635761589404,\n \"acc_stderr\": 0.040428099613956346,\n \"\
acc_norm\": 0.4304635761589404,\n \"acc_norm_stderr\": 0.040428099613956346\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590167,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590167\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5462962962962963,\n \"acc_stderr\": 0.03395322726375798,\n \"\
acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.03395322726375798\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553332,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553332\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.02508596114457966,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.02508596114457966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.035208939510976534,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.035208939510976534\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092375,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092375\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8071519795657727,\n\
\ \"acc_stderr\": 0.014108533515757433,\n \"acc_norm\": 0.8071519795657727,\n\
\ \"acc_norm_stderr\": 0.014108533515757433\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247326,\n\
\ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247326\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3318435754189944,\n\
\ \"acc_stderr\": 0.015748421208187306,\n \"acc_norm\": 0.3318435754189944,\n\
\ \"acc_norm_stderr\": 0.015748421208187306\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7459807073954984,\n\
\ \"acc_stderr\": 0.024723861504771703,\n \"acc_norm\": 0.7459807073954984,\n\
\ \"acc_norm_stderr\": 0.024723861504771703\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713002,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713002\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303055,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303055\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n\
\ \"acc_stderr\": 0.012744149704869643,\n \"acc_norm\": 0.4680573663624511,\n\
\ \"acc_norm_stderr\": 0.012744149704869643\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6568627450980392,\n \"acc_stderr\": 0.019206606848825365,\n \
\ \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.019206606848825365\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7877551020408163,\n \"acc_stderr\": 0.026176967197866767,\n\
\ \"acc_norm\": 0.7877551020408163,\n \"acc_norm_stderr\": 0.026176967197866767\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37576499388004897,\n\
\ \"mc1_stderr\": 0.016954584060214297,\n \"mc2\": 0.5421631897463146,\n\
\ \"mc2_stderr\": 0.015425726907541416\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8555643251775849,\n \"acc_stderr\": 0.009879767358079243\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6315390447308568,\n \
\ \"acc_stderr\": 0.013287342651674578\n }\n}\n```"
repo_url: https://huggingface.co/ldahee/SLAL-0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|arc:challenge|25_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|gsm8k|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hellaswag|10_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T03-39-34.987881.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-01T03-39-34.987881.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- '**/details_harness|winogrande|5_2024-03-01T03-39-34.987881.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-01T03-39-34.987881.parquet'
- config_name: results
data_files:
- split: 2024_03_01T03_39_34.987881
path:
- results_2024-03-01T03-39-34.987881.parquet
- split: latest
path:
- results_2024-03-01T03-39-34.987881.parquet
---
# Dataset Card for Evaluation run of ldahee/SLAL-0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ldahee/SLAL-0.1](https://huggingface.co/ldahee/SLAL-0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ldahee__SLAL-0.1",
"harness_winogrande_5",
split="train")
```
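Because the timestamped split names sort chronologically as plain strings, the newest run can also be picked with a lexicographic max. A small illustrative sketch (the older split name below is hypothetical; in practice the `latest` split already performs this selection for you):

```python
# Evaluation splits are named with ISO-like timestamps such as
# "2024_03_01T03_39_34.987881", so they sort chronologically as strings.
# Picking the newest run is therefore just a lexicographic max.
splits = [
    "2024_02_15T10_00_00.000000",  # hypothetical older run
    "2024_03_01T03_39_34.987881",  # the run documented in this card
]
newest = max(splits)
print(newest)  # 2024_03_01T03_39_34.987881
```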
## Latest results
These are the [latest results from run 2024-03-01T03:39:34.987881](https://huggingface.co/datasets/open-llm-leaderboard/details_ldahee__SLAL-0.1/blob/main/results_2024-03-01T03-39-34.987881.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6598872301882013,
"acc_stderr": 0.031979502334376575,
"acc_norm": 0.6609271308158413,
"acc_norm_stderr": 0.03265360850599212,
"mc1": 0.37576499388004897,
"mc1_stderr": 0.016954584060214297,
"mc2": 0.5421631897463146,
"mc2_stderr": 0.015425726907541416
},
"harness|arc:challenge|25": {
"acc": 0.5426621160409556,
"acc_stderr": 0.014558106543924058,
"acc_norm": 0.5793515358361775,
"acc_norm_stderr": 0.014426211252508396
},
"harness|hellaswag|10": {
"acc": 0.6094403505277833,
"acc_stderr": 0.004868787333436578,
"acc_norm": 0.8014339772953595,
"acc_norm_stderr": 0.003981052091169832
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7697368421052632,
"acc_stderr": 0.03426059424403165,
"acc_norm": 0.7697368421052632,
"acc_norm_stderr": 0.03426059424403165
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724057,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.03216600808802269,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.03216600808802269
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6510638297872341,
"acc_stderr": 0.031158522131357783,
"acc_norm": 0.6510638297872341,
"acc_norm_stderr": 0.031158522131357783
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6620689655172414,
"acc_stderr": 0.039417076320648906,
"acc_norm": 0.6620689655172414,
"acc_norm_stderr": 0.039417076320648906
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5052910052910053,
"acc_stderr": 0.02574986828855657,
"acc_norm": 0.5052910052910053,
"acc_norm_stderr": 0.02574986828855657
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8129032258064516,
"acc_stderr": 0.02218571009225225,
"acc_norm": 0.8129032258064516,
"acc_norm_stderr": 0.02218571009225225
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.03465304488406796,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.03465304488406796
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.03287666758603492,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.03287666758603492
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.02554565042660362,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.02554565042660362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121437,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.02366129639396428,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.02366129639396428
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251976,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251976
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.028657491285071966,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.028657491285071966
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4304635761589404,
"acc_stderr": 0.040428099613956346,
"acc_norm": 0.4304635761589404,
"acc_norm_stderr": 0.040428099613956346
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.015555802713590167,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.015555802713590167
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.03395322726375798,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.03395322726375798
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553332,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553332
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.02508596114457966,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.02508596114457966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.035208939510976534,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.035208939510976534
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092375,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092375
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757433,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757433
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.024476994076247326,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.024476994076247326
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3318435754189944,
"acc_stderr": 0.015748421208187306,
"acc_norm": 0.3318435754189944,
"acc_norm_stderr": 0.015748421208187306
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7459807073954984,
"acc_stderr": 0.024723861504771703,
"acc_norm": 0.7459807073954984,
"acc_norm_stderr": 0.024723861504771703
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600713002,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600713002
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303055,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303055
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869643,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869643
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.019206606848825365,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.019206606848825365
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7877551020408163,
"acc_stderr": 0.026176967197866767,
"acc_norm": 0.7877551020408163,
"acc_norm_stderr": 0.026176967197866767
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.031581495393387324,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.031581495393387324
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37576499388004897,
"mc1_stderr": 0.016954584060214297,
"mc2": 0.5421631897463146,
"mc2_stderr": 0.015425726907541416
},
"harness|winogrande|5": {
"acc": 0.8555643251775849,
"acc_stderr": 0.009879767358079243
},
"harness|gsm8k|5": {
"acc": 0.6315390447308568,
"acc_stderr": 0.013287342651674578
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Korakoe/MegaInstruct | ---
task_categories:
- text-generation
tags:
- large
- instruct
- usernames
pretty_name: Mega Instruct
size_categories:
- 100K<n<1M
---
# MegaInstruct
A large instruct dataset, merging multiple datasets into the Alpaca format
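For reference, an Alpaca-format record typically carries `instruction`, `input`, and `output` fields, often flattened into a single training prompt. The sketch below uses a hypothetical record and the common Alpaca prompt template; the exact template used by this dataset is an assumption:

```python
# Hypothetical Alpaca-style record; field names follow the common
# Alpaca convention (instruction / input / output), not this repo's files.
record = {
    "instruction": "Summarize the text.",
    "input": "The quick brown fox jumps over the lazy dog.",
    "output": "A fox jumps over a dog.",
}

# A typical Alpaca prompt template (assumed, not taken from this dataset):
prompt = (
    "Below is an instruction that describes a task, paired with an input.\n\n"
    f"### Instruction:\n{record['instruction']}\n\n"
    f"### Input:\n{record['input']}\n\n"
    "### Response:\n"
)
print(prompt + record["output"])
```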
### Note:
Both the gpt4all and vicuna datasets have usernames appended to them, so hopefully username-aware chatbot datasets can be added on top of this! |
rdmpage/autotrain-data-page7 | ---
task_categories:
- image-classification
---
# AutoTrain Dataset for project: page7
## Dataset Description
This dataset has been automatically processed by AutoTrain for project page7.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<241x411 RGB PIL image>",
"target": 6
},
{
"image": "<209x293 RGB PIL image>",
"target": 1
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(names=['blank', 'content', 'cover', 'end', 'endstart', 'plate', 'start'], id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follow:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 449 |
| valid | 116 |
|
HydraLM/partitioned_v3_standardized_024 | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
- name: unique_id
dtype: string
splits:
- name: train
num_bytes: 12182515.807802783
num_examples: 22656
download_size: 12802424
dataset_size: 12182515.807802783
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "partitioned_v3_standardized_024"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anonymous347928/broden_concepts | ---
license: mit
---
|
xiwen426/tokenized_dataset_intent_3600_SampleScripts | ---
license: apache-2.0
---
|
heegyu/ko-openchat-0404-test | ---
dataset_info:
features:
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 151814905.0
num_examples: 70000
download_size: 77237824
dataset_size: 151814905.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
For Korean chatbot training, several datasets were collected and unified into a single format (the first 10,000 examples were extracted from each dataset).
- [heegyu/glaive-function-calling-v2-ko](https://huggingface.co/datasets/heegyu/glaive-function-calling-v2-ko): 15170 items
- [heegyu/PKU-SafeRLHF-ko](https://huggingface.co/datasets/heegyu/PKU-SafeRLHF-ko): 135213 items
- [maywell/koVast](https://huggingface.co/datasets/maywell/koVast): 684579 items
- [MarkrAI/KoCommercial-Dataset](https://huggingface.co/datasets/MarkrAI/KoCommercial-Dataset): 175454 items
- [HuggingFaceH4/ultrachat_200k](https://huggingface.co/datasets/HuggingFaceH4/ultrachat_200k): 207865 items
- [Open-Orca/SlimOrca-Dedup](https://huggingface.co/datasets/Open-Orca/SlimOrca-Dedup): 363491 items
- [glaiveai/glaive-code-assistant-v2](https://huggingface.co/datasets/glaiveai/glaive-code-assistant-v2): 215166 items
|
Shekswess/mistral_medquad_instruct_dataset | ---
language:
- en
size_categories:
- 10K<n<100K
task_categories:
- question-answering
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 47296307
num_examples: 16359
download_size: 17865039
dataset_size: 47296307
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
tags:
- medical
---
Dataset made for supervised instruction finetuning of Mistral LLMs, based on the MedQuAD dataset:
- Medquad dataset (https://www.kaggle.com/datasets/jpmiller/layoutlm)
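As a rough illustration of how a `prompt` column could be assembled from the `instruction` and `input` columns listed in the metadata above, here is a hedged sketch using a Mistral-instruct-style `[INST]` template (the actual template used to build this dataset may differ):

```python
def build_mistral_prompt(instruction: str, question: str) -> str:
    """Assemble a Mistral-instruct-style prompt from an instruction and a
    medical question. The [INST] wrapper is the common Mistral chat format;
    the exact template used for this dataset is an assumption."""
    return f"<s>[INST] {instruction}\n\n{question} [/INST]"

prompt = build_mistral_prompt(
    "Answer the following medical question.",
    "What are the symptoms of glaucoma?",
)
print(prompt)
```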
## MedQuAD
MedQuAD is a comprehensive collection consisting of 47,457 medical question-answer pairs compiled from 12 authoritative sources within the National Institutes of Health (NIH), including domains like cancer.gov, niddk.nih.gov, GARD, and MedlinePlus Health Topics. These question-answer pairs span 37 distinct question types, covering a wide spectrum of medical subjects, including diseases, drugs, and medical procedures. The dataset features additional annotations provided in XML files, facilitating various Information Retrieval (IR) and Natural Language Processing (NLP) tasks. These annotations encompass crucial information such as question type, question focus, synonyms, Unique Identifier (CUI) from the Unified Medical Language System (UMLS), and Semantic Type. Moreover, the dataset includes categorization of question focuses into three main categories: Disease, Drug, or Other, with the exception of collections from MedlinePlus, which exclusively focus on diseases. |
Multimodal-Fatima/VQAv2_minival_validation_vprevious | ---
dataset_info:
features:
- name: question_type
dtype: string
- name: multiple_choice_answer
dtype: string
- name: answers
sequence: string
- name: answers_original
list:
- name: answer
dtype: string
- name: answer_confidence
dtype: string
- name: answer_id
dtype: int64
- name: id_image
dtype: int64
- name: answer_type
dtype: string
- name: question_id
dtype: int64
- name: question
dtype: string
- name: image
dtype: image
- name: clip_tags_ViT_L_14
sequence: string
- name: blip_caption
dtype: string
- name: LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14
sequence: string
- name: DETA_detections_deta_swin_large_o365_coco_classes
list:
- name: attribute
dtype: string
- name: box
sequence: float32
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float32
- name: size
dtype: string
- name: tag
dtype: string
- name: DETA_detections_deta_swin_large_o365_clip_ViT_L_14
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
- name: DETA_detections_deta_swin_large_o365_clip_ViT_L_14_blip_caption
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: caption
dtype: string
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
- name: id
dtype: int64
- name: DETA_detections_deta_swin_large_o365_clip_ViT_L_14_blip_caption_caption_module
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: caption
dtype: string
- name: captions_module
sequence: string
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
- name: DETA_detections_deta_swin_large_o365_clip_ViT_L_14_blip_caption_caption_module_without_filtering
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: caption
dtype: string
- name: captions_module
sequence: string
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
- name: DETA_detections_deta_swin_large_o365_clip_ViT_L_14_blip_caption_caption_module_random
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: caption
dtype: string
- name: captions_module
sequence: string
- name: captions_module_filter
sequence: string
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
- name: clip_tags_LAION_ViT_H_14_2B
sequence: string
- name: LLM_Description_gpt3_downstream_tasks_visual_genome_LAION-ViT-H-14-2B
sequence: string
- name: Attributes_ViT_L_14_descriptors_text_davinci_003_full
sequence: string
- name: clip_tags_ViT_L_14_wo_openai
sequence: string
- name: clip_tags_ViT_L_14_with_openai
sequence: string
- name: clip_tags_LAION_ViT_H_14_2B_wo_openai
sequence: string
- name: clip_tags_LAION_ViT_H_14_2B_with_openai
sequence: string
- name: clip_tags_LAION_ViT_bigG_14_2B_wo_openai
sequence: string
- name: clip_tags_LAION_ViT_bigG_14_2B_with_openai
sequence: string
- name: Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full
sequence: string
- name: Attributes_LAION_ViT_bigG_14_2B_descriptors_text_davinci_003_full
sequence: string
- name: clip_tags_ViT_B_16_with_openai
sequence: string
- name: blip_caption_beam_5_Salesforce_blip2_flan_t5_xxl
dtype: string
- name: DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large_
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: captions_all_patches
sequence: string
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
- name: DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large_clean
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: captions_all_patches
sequence: string
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
splits:
- name: validation
num_bytes: 10522752337.0
num_examples: 25994
download_size: 2699481376
dataset_size: 10522752337.0
---
# Dataset Card for "VQA_minival_validation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Mitsuki-Sakamoto/alpaca_farm-deberta-re-preference-64-nsample-2_iso_filter_gold_thr_0.5_self_70m | ---
dataset_info:
config_name: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: index
dtype: int64
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
splits:
- name: epoch_0
num_bytes: 43220385
num_examples: 18929
- name: epoch_1
num_bytes: 43974494
num_examples: 18929
- name: epoch_2
num_bytes: 44030301
num_examples: 18929
- name: epoch_3
num_bytes: 44071124
num_examples: 18929
- name: epoch_4
num_bytes: 44079017
num_examples: 18929
- name: epoch_5
num_bytes: 44103453
num_examples: 18929
- name: epoch_6
num_bytes: 44114405
num_examples: 18929
- name: epoch_7
num_bytes: 44125274
num_examples: 18929
- name: epoch_8
num_bytes: 44134472
num_examples: 18929
- name: epoch_9
num_bytes: 44140308
num_examples: 18929
download_size: 655477139
dataset_size: 439993233
configs:
- config_name: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: epoch_0
path: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-*
- split: epoch_2
path: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-*
- split: epoch_3
path: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia_70m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-*
---
|
aarthii/project | ---
license: openrail
---
|
Imran263/Gath_baize_mod | ---
license: mit
---
|
metamath/codeparrot-ds-raw-sm | ---
dataset_info:
features:
- name: repo_name
dtype: string
- name: path
dtype: string
- name: copies
dtype: string
- name: size
dtype: string
- name: content
dtype: string
- name: license
dtype: string
splits:
- name: train
num_bytes: 60634
num_examples: 10
- name: valid
num_bytes: 98567
num_examples: 10
download_size: 86952
dataset_size: 159201
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
---
- A demo dataset that extracts Data Science-related code from the `transformersbook/codeparrot-train` dataset and keeps only 10 examples per split.
|
eanderson/squad_v1_nb | ---
license: mit
---
|
AdapterOcean/med_alpaca_standardized_cluster_40_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 17787714
num_examples: 11009
download_size: 9173963
dataset_size: 17787714
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_40_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_adamo1139__LWM-7B-1M-1000000ctx-AEZAKMI-3_1-1702 | ---
pretty_name: Evaluation run of adamo1139/LWM-7B-1M-1000000ctx-AEZAKMI-3_1-1702
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [adamo1139/LWM-7B-1M-1000000ctx-AEZAKMI-3_1-1702](https://huggingface.co/adamo1139/LWM-7B-1M-1000000ctx-AEZAKMI-3_1-1702)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_adamo1139__LWM-7B-1M-1000000ctx-AEZAKMI-3_1-1702\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-17T19:47:40.194212](https://huggingface.co/datasets/open-llm-leaderboard/details_adamo1139__LWM-7B-1M-1000000ctx-AEZAKMI-3_1-1702/blob/main/results_2024-02-17T19-47-40.194212.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.43343053522803004,\n\
\ \"acc_stderr\": 0.033978126605441465,\n \"acc_norm\": 0.43829292837370853,\n\
\ \"acc_norm_stderr\": 0.034771202687737465,\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.015905987048184828,\n \"mc2\": 0.4418799998573623,\n\
\ \"mc2_stderr\": 0.014628165296907778\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4761092150170648,\n \"acc_stderr\": 0.014594701798071654,\n\
\ \"acc_norm\": 0.5119453924914675,\n \"acc_norm_stderr\": 0.014607220340597167\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.577275443138817,\n\
\ \"acc_stderr\": 0.004929828337606981,\n \"acc_norm\": 0.770762796255726,\n\
\ \"acc_norm_stderr\": 0.0041948307161260665\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.40131578947368424,\n \"acc_stderr\": 0.039889037033362836,\n\
\ \"acc_norm\": 0.40131578947368424,\n \"acc_norm_stderr\": 0.039889037033362836\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.46037735849056605,\n \"acc_stderr\": 0.030676096599389184,\n\
\ \"acc_norm\": 0.46037735849056605,\n \"acc_norm_stderr\": 0.030676096599389184\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4027777777777778,\n\
\ \"acc_stderr\": 0.04101405519842425,\n \"acc_norm\": 0.4027777777777778,\n\
\ \"acc_norm_stderr\": 0.04101405519842425\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036844,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036844\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.37,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.37572254335260113,\n\
\ \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.37572254335260113,\n\
\ \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.038739587141493524,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.038739587141493524\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3659574468085106,\n \"acc_stderr\": 0.03148955829745529,\n\
\ \"acc_norm\": 0.3659574468085106,\n \"acc_norm_stderr\": 0.03148955829745529\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.20175438596491227,\n\
\ \"acc_stderr\": 0.037752050135836386,\n \"acc_norm\": 0.20175438596491227,\n\
\ \"acc_norm_stderr\": 0.037752050135836386\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2724867724867725,\n \"acc_stderr\": 0.022930973071633345,\n \"\
acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.022930973071633345\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n\
\ \"acc_stderr\": 0.0361960452412425,\n \"acc_norm\": 0.20634920634920634,\n\
\ \"acc_norm_stderr\": 0.0361960452412425\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4838709677419355,\n\
\ \"acc_stderr\": 0.028429203176724555,\n \"acc_norm\": 0.4838709677419355,\n\
\ \"acc_norm_stderr\": 0.028429203176724555\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.29064039408866993,\n \"acc_stderr\": 0.03194740072265541,\n\
\ \"acc_norm\": 0.29064039408866993,\n \"acc_norm_stderr\": 0.03194740072265541\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.03793713171165636,\n\
\ \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.03793713171165636\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.494949494949495,\n \"acc_stderr\": 0.035621707606254015,\n \"\
acc_norm\": 0.494949494949495,\n \"acc_norm_stderr\": 0.035621707606254015\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5803108808290155,\n \"acc_stderr\": 0.035615873276858834,\n\
\ \"acc_norm\": 0.5803108808290155,\n \"acc_norm_stderr\": 0.035615873276858834\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3974358974358974,\n \"acc_stderr\": 0.024811920017903836,\n\
\ \"acc_norm\": 0.3974358974358974,\n \"acc_norm_stderr\": 0.024811920017903836\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712177,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712177\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3739495798319328,\n \"acc_stderr\": 0.031429466378837076,\n\
\ \"acc_norm\": 0.3739495798319328,\n \"acc_norm_stderr\": 0.031429466378837076\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6018348623853211,\n \"acc_stderr\": 0.020987989422654268,\n \"\
acc_norm\": 0.6018348623853211,\n \"acc_norm_stderr\": 0.020987989422654268\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2824074074074074,\n \"acc_stderr\": 0.030701372111510927,\n \"\
acc_norm\": 0.2824074074074074,\n \"acc_norm_stderr\": 0.030701372111510927\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5833333333333334,\n \"acc_stderr\": 0.03460228327239172,\n \"\
acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.03460228327239172\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6160337552742616,\n \"acc_stderr\": 0.031658678064106674,\n \
\ \"acc_norm\": 0.6160337552742616,\n \"acc_norm_stderr\": 0.031658678064106674\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.515695067264574,\n\
\ \"acc_stderr\": 0.0335412657542081,\n \"acc_norm\": 0.515695067264574,\n\
\ \"acc_norm_stderr\": 0.0335412657542081\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5267175572519084,\n \"acc_stderr\": 0.04379024936553893,\n\
\ \"acc_norm\": 0.5267175572519084,\n \"acc_norm_stderr\": 0.04379024936553893\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.628099173553719,\n \"acc_stderr\": 0.04412015806624504,\n \"acc_norm\"\
: 0.628099173553719,\n \"acc_norm_stderr\": 0.04412015806624504\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04803752235190192,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04803752235190192\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4110429447852761,\n \"acc_stderr\": 0.038656978537853624,\n\
\ \"acc_norm\": 0.4110429447852761,\n \"acc_norm_stderr\": 0.038656978537853624\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.49514563106796117,\n \"acc_stderr\": 0.049505043821289195,\n\
\ \"acc_norm\": 0.49514563106796117,\n \"acc_norm_stderr\": 0.049505043821289195\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6196581196581197,\n\
\ \"acc_stderr\": 0.03180425204384099,\n \"acc_norm\": 0.6196581196581197,\n\
\ \"acc_norm_stderr\": 0.03180425204384099\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6015325670498084,\n\
\ \"acc_stderr\": 0.017507438602777405,\n \"acc_norm\": 0.6015325670498084,\n\
\ \"acc_norm_stderr\": 0.017507438602777405\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5057803468208093,\n \"acc_stderr\": 0.02691729617914911,\n\
\ \"acc_norm\": 0.5057803468208093,\n \"acc_norm_stderr\": 0.02691729617914911\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n\
\ \"acc_stderr\": 0.014310999547961443,\n \"acc_norm\": 0.24134078212290502,\n\
\ \"acc_norm_stderr\": 0.014310999547961443\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.434640522875817,\n \"acc_stderr\": 0.028384256704883034,\n\
\ \"acc_norm\": 0.434640522875817,\n \"acc_norm_stderr\": 0.028384256704883034\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5498392282958199,\n\
\ \"acc_stderr\": 0.028256660723360173,\n \"acc_norm\": 0.5498392282958199,\n\
\ \"acc_norm_stderr\": 0.028256660723360173\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.027777777777777797,\n\
\ \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.027777777777777797\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36879432624113473,\n \"acc_stderr\": 0.028782227561347243,\n \
\ \"acc_norm\": 0.36879432624113473,\n \"acc_norm_stderr\": 0.028782227561347243\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3409387222946545,\n\
\ \"acc_stderr\": 0.01210681720306721,\n \"acc_norm\": 0.3409387222946545,\n\
\ \"acc_norm_stderr\": 0.01210681720306721\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3492647058823529,\n \"acc_stderr\": 0.028959755196824876,\n\
\ \"acc_norm\": 0.3492647058823529,\n \"acc_norm_stderr\": 0.028959755196824876\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.44281045751633985,\n \"acc_stderr\": 0.020095083154577347,\n \
\ \"acc_norm\": 0.44281045751633985,\n \"acc_norm_stderr\": 0.020095083154577347\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n\
\ \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n\
\ \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.40408163265306124,\n \"acc_stderr\": 0.031414708025865885,\n\
\ \"acc_norm\": 0.40408163265306124,\n \"acc_norm_stderr\": 0.031414708025865885\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5323383084577115,\n\
\ \"acc_stderr\": 0.03528131472933607,\n \"acc_norm\": 0.5323383084577115,\n\
\ \"acc_norm_stderr\": 0.03528131472933607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3855421686746988,\n\
\ \"acc_stderr\": 0.037891344246115496,\n \"acc_norm\": 0.3855421686746988,\n\
\ \"acc_norm_stderr\": 0.037891344246115496\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6023391812865497,\n \"acc_stderr\": 0.0375363895576169,\n\
\ \"acc_norm\": 0.6023391812865497,\n \"acc_norm_stderr\": 0.0375363895576169\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.015905987048184828,\n \"mc2\": 0.4418799998573623,\n\
\ \"mc2_stderr\": 0.014628165296907778\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7205998421468035,\n \"acc_stderr\": 0.01261082653940468\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0887035633055345,\n \
\ \"acc_stderr\": 0.007831458737058716\n }\n}\n```"
repo_url: https://huggingface.co/adamo1139/LWM-7B-1M-1000000ctx-AEZAKMI-3_1-1702
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|arc:challenge|25_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|gsm8k|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hellaswag|10_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T19-47-40.194212.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-17T19-47-40.194212.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- '**/details_harness|winogrande|5_2024-02-17T19-47-40.194212.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-17T19-47-40.194212.parquet'
- config_name: results
data_files:
- split: 2024_02_17T19_47_40.194212
path:
- results_2024-02-17T19-47-40.194212.parquet
- split: latest
path:
- results_2024-02-17T19-47-40.194212.parquet
---
# Dataset Card for Evaluation run of adamo1139/LWM-7B-1M-1000000ctx-AEZAKMI-3_1-1702
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [adamo1139/LWM-7B-1M-1000000ctx-AEZAKMI-3_1-1702](https://huggingface.co/adamo1139/LWM-7B-1M-1000000ctx-AEZAKMI-3_1-1702) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_adamo1139__LWM-7B-1M-1000000ctx-AEZAKMI-3_1-1702",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-17T19:47:40.194212](https://huggingface.co/datasets/open-llm-leaderboard/details_adamo1139__LWM-7B-1M-1000000ctx-AEZAKMI-3_1-1702/blob/main/results_2024-02-17T19-47-40.194212.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.43343053522803004,
"acc_stderr": 0.033978126605441465,
"acc_norm": 0.43829292837370853,
"acc_norm_stderr": 0.034771202687737465,
"mc1": 0.2913096695226438,
"mc1_stderr": 0.015905987048184828,
"mc2": 0.4418799998573623,
"mc2_stderr": 0.014628165296907778
},
"harness|arc:challenge|25": {
"acc": 0.4761092150170648,
"acc_stderr": 0.014594701798071654,
"acc_norm": 0.5119453924914675,
"acc_norm_stderr": 0.014607220340597167
},
"harness|hellaswag|10": {
"acc": 0.577275443138817,
"acc_stderr": 0.004929828337606981,
"acc_norm": 0.770762796255726,
"acc_norm_stderr": 0.0041948307161260665
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.40131578947368424,
"acc_stderr": 0.039889037033362836,
"acc_norm": 0.40131578947368424,
"acc_norm_stderr": 0.039889037033362836
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.46037735849056605,
"acc_stderr": 0.030676096599389184,
"acc_norm": 0.46037735849056605,
"acc_norm_stderr": 0.030676096599389184
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4027777777777778,
"acc_stderr": 0.04101405519842425,
"acc_norm": 0.4027777777777778,
"acc_norm_stderr": 0.04101405519842425
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036844,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036844
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.37572254335260113,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.37572254335260113,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.038739587141493524,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.038739587141493524
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3659574468085106,
"acc_stderr": 0.03148955829745529,
"acc_norm": 0.3659574468085106,
"acc_norm_stderr": 0.03148955829745529
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.20175438596491227,
"acc_stderr": 0.037752050135836386,
"acc_norm": 0.20175438596491227,
"acc_norm_stderr": 0.037752050135836386
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2724867724867725,
"acc_stderr": 0.022930973071633345,
"acc_norm": 0.2724867724867725,
"acc_norm_stderr": 0.022930973071633345
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.0361960452412425,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.0361960452412425
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4838709677419355,
"acc_stderr": 0.028429203176724555,
"acc_norm": 0.4838709677419355,
"acc_norm_stderr": 0.028429203176724555
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.29064039408866993,
"acc_stderr": 0.03194740072265541,
"acc_norm": 0.29064039408866993,
"acc_norm_stderr": 0.03194740072265541
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.03793713171165636,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.03793713171165636
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.494949494949495,
"acc_stderr": 0.035621707606254015,
"acc_norm": 0.494949494949495,
"acc_norm_stderr": 0.035621707606254015
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5803108808290155,
"acc_stderr": 0.035615873276858834,
"acc_norm": 0.5803108808290155,
"acc_norm_stderr": 0.035615873276858834
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3974358974358974,
"acc_stderr": 0.024811920017903836,
"acc_norm": 0.3974358974358974,
"acc_norm_stderr": 0.024811920017903836
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712177,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712177
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3739495798319328,
"acc_stderr": 0.031429466378837076,
"acc_norm": 0.3739495798319328,
"acc_norm_stderr": 0.031429466378837076
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6018348623853211,
"acc_stderr": 0.020987989422654268,
"acc_norm": 0.6018348623853211,
"acc_norm_stderr": 0.020987989422654268
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2824074074074074,
"acc_stderr": 0.030701372111510927,
"acc_norm": 0.2824074074074074,
"acc_norm_stderr": 0.030701372111510927
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.03460228327239172,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.03460228327239172
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6160337552742616,
"acc_stderr": 0.031658678064106674,
"acc_norm": 0.6160337552742616,
"acc_norm_stderr": 0.031658678064106674
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.515695067264574,
"acc_stderr": 0.0335412657542081,
"acc_norm": 0.515695067264574,
"acc_norm_stderr": 0.0335412657542081
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5267175572519084,
"acc_stderr": 0.04379024936553893,
"acc_norm": 0.5267175572519084,
"acc_norm_stderr": 0.04379024936553893
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.628099173553719,
"acc_stderr": 0.04412015806624504,
"acc_norm": 0.628099173553719,
"acc_norm_stderr": 0.04412015806624504
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04803752235190192,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04803752235190192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4110429447852761,
"acc_stderr": 0.038656978537853624,
"acc_norm": 0.4110429447852761,
"acc_norm_stderr": 0.038656978537853624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.49514563106796117,
"acc_stderr": 0.049505043821289195,
"acc_norm": 0.49514563106796117,
"acc_norm_stderr": 0.049505043821289195
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6196581196581197,
"acc_stderr": 0.03180425204384099,
"acc_norm": 0.6196581196581197,
"acc_norm_stderr": 0.03180425204384099
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6015325670498084,
"acc_stderr": 0.017507438602777405,
"acc_norm": 0.6015325670498084,
"acc_norm_stderr": 0.017507438602777405
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5057803468208093,
"acc_stderr": 0.02691729617914911,
"acc_norm": 0.5057803468208093,
"acc_norm_stderr": 0.02691729617914911
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.014310999547961443,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.014310999547961443
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.434640522875817,
"acc_stderr": 0.028384256704883034,
"acc_norm": 0.434640522875817,
"acc_norm_stderr": 0.028384256704883034
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5498392282958199,
"acc_stderr": 0.028256660723360173,
"acc_norm": 0.5498392282958199,
"acc_norm_stderr": 0.028256660723360173
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.027777777777777797,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.027777777777777797
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36879432624113473,
"acc_stderr": 0.028782227561347243,
"acc_norm": 0.36879432624113473,
"acc_norm_stderr": 0.028782227561347243
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3409387222946545,
"acc_stderr": 0.01210681720306721,
"acc_norm": 0.3409387222946545,
"acc_norm_stderr": 0.01210681720306721
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3492647058823529,
"acc_stderr": 0.028959755196824876,
"acc_norm": 0.3492647058823529,
"acc_norm_stderr": 0.028959755196824876
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.44281045751633985,
"acc_stderr": 0.020095083154577347,
"acc_norm": 0.44281045751633985,
"acc_norm_stderr": 0.020095083154577347
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.40408163265306124,
"acc_stderr": 0.031414708025865885,
"acc_norm": 0.40408163265306124,
"acc_norm_stderr": 0.031414708025865885
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5323383084577115,
"acc_stderr": 0.03528131472933607,
"acc_norm": 0.5323383084577115,
"acc_norm_stderr": 0.03528131472933607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.037891344246115496,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.037891344246115496
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6023391812865497,
"acc_stderr": 0.0375363895576169,
"acc_norm": 0.6023391812865497,
"acc_norm_stderr": 0.0375363895576169
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2913096695226438,
"mc1_stderr": 0.015905987048184828,
"mc2": 0.4418799998573623,
"mc2_stderr": 0.014628165296907778
},
"harness|winogrande|5": {
"acc": 0.7205998421468035,
"acc_stderr": 0.01261082653940468
},
"harness|gsm8k|5": {
"acc": 0.0887035633055345,
"acc_stderr": 0.007831458737058716
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
deusprofano/images | ---
license: other
---
|
pardeep/youtube-vidoes-transcripts-hindi-english | ---
license: odc-by
---
**Context**
The dataset contains Hindi and English subtitles for popular YouTube channels. It was created mainly around Hindi-language channels, since the primary goal was to use this dataset to build LLMs for the Hindi language.
Data from channels in categories such as Information, Entertainment, Politics, Comedy, and News has been included in this dataset.
***Dataset Stats:***
- **58 channels**
- **103,042 total videos**
**Content**
- Video subtitles in Hindi and English
- Video metadata such as duration, number of comments, like counts, and published date
**Acknowledgements**
The source of this dataset is YouTube. The following packages were used to generate this dataset:
- [youtube-transcript-api](https://pypi.org/project/youtube-transcript-api/)
- [google-api-python-client](https://pypi.org/project/google-api-python-client/)
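As a rough illustration of how per-caption snippets could be assembled into full subtitle text, the sketch below joins caption entries in the format that `youtube-transcript-api` returns (a list of dicts with `text`, `start`, and `duration` keys). This is a hedged sketch of the general approach with made-up sample captions, not the exact pipeline used to build the dataset:

```python
# Join per-caption transcript snippets (the format returned by
# youtube-transcript-api's get_transcript) into one subtitle string.
# The sample snippets below are illustrative, not taken from the dataset.

def join_transcript(snippets: list) -> str:
    """Concatenate caption texts in playback order."""
    ordered = sorted(snippets, key=lambda s: s["start"])
    return " ".join(s["text"].strip() for s in ordered)

snippets = [
    {"text": "namaste doston", "start": 0.0, "duration": 2.1},
    {"text": "aaj ka video", "start": 2.1, "duration": 1.8},
]

print(join_transcript(snippets))  # namaste doston aaj ka video
```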
**Inspiration**
- Build LLM models using Hindi
- Finetune models using Hindi for tasks like classification, summarization, translation, etc. |
dwidlee/wiki-dump-ko | ---
license: cc
---
|
mateusss22/meumodelos | ---
license: openrail
---
|
ibranze/araproje_arc_en_s3 | ---
dataset_info:
features:
- name: id
dtype: string
- name: question
dtype: string
- name: choices
sequence:
- name: text
dtype: string
- name: label
dtype: string
- name: answerKey
dtype: string
splits:
- name: validation
num_bytes: 80031.0
num_examples: 250
download_size: 46971
dataset_size: 80031.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "araproje_arc_en_s3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yangwang825/vox2-veri-full | ---
task_categories:
- audio-classification
tags:
- audio
- VoxCeleb
- verification
---
# VoxCeleb 2
VoxCeleb2 contains over 1 million utterances for 6,112 celebrities, extracted from videos uploaded to YouTube.
## Verification Split
| | train | validation | test |
| :---: | :---: | :---: | :---: |
| # of speakers | 5,994 | 5,994 | 118 |
| # of samples | 982,808 | 109,201 | 36,237 |
## Data Fields
- ID (string): The ID of the sample with format `<spk_id--utt_id_start_stop>`.
- duration (float64): The duration of the segment in seconds.
- wav (string): The filepath of the waveform.
- start (int64): The start index of the segment, which is (start seconds) × (sample rate).
- stop (int64): The stop index of the segment, which is (stop seconds) × (sample rate).
- spk_id (string): The ID of the speaker.
Example:
```
{
'ID': 'id09056--00112_0_89088',
'duration': 5.568,
'wav': 'id09056/U2mRgZ1tW04/00112.wav',
'start': 0,
'stop': 89088,
'spk_id': 'id09056'
}
```
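Since `start` and `stop` are waveform indices, a segment's duration can be recovered by dividing their difference by the sample rate. A minimal sketch (assuming 16 kHz audio, consistent with the example record above):

```python
# Convert the start/stop waveform indices of a sample back to seconds.
# The 16 kHz sample rate is an assumption for illustration; verify it
# against the actual audio files.

SAMPLE_RATE = 16000

def segment_duration(start: int, stop: int, sample_rate: int = SAMPLE_RATE) -> float:
    """Duration in seconds of the [start, stop) index range."""
    return (stop - start) / sample_rate

sample = {
    'ID': 'id09056--00112_0_89088',
    'duration': 5.568,
    'wav': 'id09056/U2mRgZ1tW04/00112.wav',
    'start': 0,
    'stop': 89088,
    'spk_id': 'id09056',
}

print(segment_duration(sample['start'], sample['stop']))  # 5.568
```

The same indices can also be used to slice the decoded waveform array directly, e.g. `waveform[start:stop]`.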
## References
- https://www.robots.ox.ac.uk/~vgg/data/voxceleb/vox2.html |
Rexhaif/mintaka-qa-en | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 1047958
num_examples: 14000
- name: dev
num_bytes: 150348
num_examples: 2000
- name: test
num_bytes: 298101
num_examples: 4000
download_size: 787307
dataset_size: 1496407
---
# Dataset Card for "mintaka-qa-en"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Mitali05/llama2-finetune-sentiment-analysis | ---
license: llama2
---
|
liuyanchen1015/MULTI_VALUE_sst2_superlative_before_matrix_head | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: train
num_bytes: 72
num_examples: 1
download_size: 1911
dataset_size: 72
---
# Dataset Card for "MULTI_VALUE_sst2_superlative_before_matrix_head"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-high_school_geography | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: fewshot_context_neg
dtype: string
splits:
- name: dev
num_bytes: 4484
num_examples: 5
- name: test
num_bytes: 710052
num_examples: 198
download_size: 89752
dataset_size: 714536
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
# Dataset Card for "mmlu-high_school_geography"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
huzaifahp7/ECTSum_Zero_Not_Hero | ---
license: mit
---
|
CyberHarem/locusta_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of locusta/ロクスタ/洛库斯塔 (Fate/Grand Order)
This is the dataset of locusta/ロクスタ/洛库斯塔 (Fate/Grand Order), containing 15 images and their tags.
The core tags of this character are `short_hair, purple_eyes, blue_hair, multicolored_hair, breasts, hat, green_hair, purple_headwear, aqua_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 15 | 29.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/locusta_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 15 | 26.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/locusta_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 32 | 47.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/locusta_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/locusta_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------|
| 0 | 15 |  |  |  |  |  | smile, solo, 1girl, looking_at_viewer, sharp_teeth, open_mouth, holding, blush, mushroom, navel, gloves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | smile | solo | 1girl | looking_at_viewer | sharp_teeth | open_mouth | holding | blush | mushroom | navel | gloves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------------------|:--------------|:-------------|:----------|:--------|:-----------|:--------|:---------|
| 0 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X |
|
Saulons3/cocina | ---
license: apache-2.0
---
|
andykcheng/presidents | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: student
dtype: string
- name: score
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 25358
num_examples: 20
download_size: 12194
dataset_size: 25358
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mHossain/text_summary_v3 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: text
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 234990.0
num_examples: 45
- name: test
num_bytes: 26110.0
num_examples: 5
download_size: 183680
dataset_size: 261100.0
---
# Dataset Card for "text_summary_v3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nadsoft/Podcasts-all-info | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: text
dtype: string
- name: quality
dtype: string
- name: Gender
dtype: string
- name: external_audio
dtype: string
splits:
- name: train
num_bytes: 571178438.145
num_examples: 4545
- name: test
num_bytes: 58907351.4
num_examples: 505
download_size: 616913825
dataset_size: 630085789.545
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
CyberHarem/kochou_kanae_demonslayer | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kochou_kanae (Kimetsu no Yaiba)
This is the dataset of kochou_kanae (Kimetsu no Yaiba), containing 142 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
|
ccaligned_multilingual | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- af
- ak
- am
- ar
- as
- ay
- az
- be
- bg
- bm
- bn
- br
- bs
- ca
- ceb
- ckb
- cs
- cy
- de
- dv
- el
- eo
- es
- fa
- ff
- fi
- fo
- fr
- fy
- ga
- gl
- gn
- gu
- he
- hi
- hr
- hu
- id
- ig
- is
- it
- iu
- ja
- ka
- kac
- kg
- kk
- km
- kn
- ko
- ku
- ky
- la
- lg
- li
- ln
- lo
- lt
- lv
- mg
- mi
- mk
- ml
- mn
- mr
- ms
- mt
- my
- ne
- nl
- 'no'
- nso
- ny
- om
- or
- pa
- pl
- ps
- pt
- rm
- ro
- ru
- rw
- sc
- sd
- se
- shn
- si
- sk
- sl
- sn
- so
- sq
- sr
- ss
- st
- su
- sv
- sw
- syc
- szl
- ta
- te
- tg
- th
- ti
- tl
- tn
- tr
- ts
- tt
- ug
- uk
- ur
- uz
- ve
- vi
- war
- wo
- xh
- yi
- yo
- zgh
- zh
- zu
- zza
license:
- unknown
multilinguality:
- translation
size_categories:
- n<1K
- 1K<n<10K
- 10K<n<100K
- 100K<n<1M
- 1M<n<10M
- 10M<n<100M
source_datasets:
- original
task_categories:
- other
paperswithcode_id: ccaligned
pretty_name: CCAligned
dataset_info:
- config_name: documents-zz_TR
features:
- name: Domain
dtype: string
- name: Source_URL
dtype: string
- name: Target_URL
dtype: string
- name: translation
dtype:
translation:
languages:
- en_XX
- zz_TR
splits:
- name: train
num_bytes: 641412
num_examples: 41
download_size: 125488
dataset_size: 641412
- config_name: sentences-zz_TR
features:
- name: translation
dtype:
translation:
languages:
- en_XX
- zz_TR
- name: LASER_similarity
dtype: float32
splits:
- name: train
num_bytes: 4056
num_examples: 34
download_size: 1428
dataset_size: 4056
- config_name: documents-tz_MA
features:
- name: Domain
dtype: string
- name: Source_URL
dtype: string
- name: Target_URL
dtype: string
- name: translation
dtype:
translation:
languages:
- en_XX
- tz_MA
splits:
- name: train
num_bytes: 51782
num_examples: 4
download_size: 11996
dataset_size: 51782
- config_name: sentences-tz_MA
features:
- name: translation
dtype:
translation:
languages:
- en_XX
- tz_MA
- name: LASER_similarity
dtype: float32
splits:
- name: train
num_bytes: 6256
num_examples: 33
download_size: 2420
dataset_size: 6256
- config_name: documents-ak_GH
features:
- name: Domain
dtype: string
- name: Source_URL
dtype: string
- name: Target_URL
dtype: string
- name: translation
dtype:
translation:
languages:
- en_XX
- ak_GH
splits:
- name: train
num_bytes: 10738312
num_examples: 249
download_size: 399236
dataset_size: 10738312
- config_name: sentences-ak_GH
features:
- name: translation
dtype:
translation:
languages:
- en_XX
- ak_GH
- name: LASER_similarity
dtype: float32
splits:
- name: train
num_bytes: 50110
num_examples: 478
download_size: 17636
dataset_size: 50110
---
# Dataset Card for ccaligned_multilingual
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** http://www.statmt.org/cc-aligned/
- **Repository:** [Needs More Information]
- **Paper:** https://www.aclweb.org/anthology/2020.emnlp-main.480.pdf
- **Leaderboard:** [Needs More Information]
- **Point of Contact:** [Needs More Information]
### Dataset Summary
CCAligned consists of parallel or comparable web-document pairs in 137 languages aligned with English. These web-document pairs were constructed by performing language identification on raw web documents and ensuring that corresponding language codes appeared in the URLs of the web documents. This pattern-matching approach yielded more than 100 million aligned documents paired with English. Recognizing that each English document was often aligned to multiple documents in different target languages, we can join on English documents to obtain aligned documents that directly pair two non-English documents (e.g., Arabic-French). This corpus was created from 68 Common Crawl snapshots.
To load a language which isn't part of the config, all you need to do is specify the language code. You can find the valid languages at http://www.statmt.org/cc-aligned/. E.g.:
```
dataset = load_dataset("ccaligned_multilingual", language_code="fr_XX", type="documents")
```
or
```
dataset = load_dataset("ccaligned_multilingual", language_code="fr_XX", type="sentences")
```
### Supported Tasks and Leaderboards
[Needs More Information]
### Languages
The text in the dataset is in (137) multiple languages aligned with english.
## Dataset Structure
### Data Instances
An instance of `documents` type for language `ak_GH`:
```
{'Domain': 'islamhouse.com', 'Source_URL': 'https://islamhouse.com/en/audios/373088/', 'Target_URL': 'https://islamhouse.com/ak/audios/373088/', 'translation': {'ak_GH': "Ntwatiaa / wɔabɔ no tɔfa wɔ mu no te ase ma Umrah - Arab kasa|Islamhouse.com|Follow us:|facebook|twitter|taepe|Titles All|Fie wibesite|kasa nyina|Buukuu edi adanse ma prente|Nhyehyɛmu|Nyim/sua Islam|Curriculums|Nyina ndeɛma|Nyina ndeɛma (295)|Buukuu/ nwoma (2)|sini / muuvi (31)|ɔdio (262)|Aɛn websideNew!|Kɔ wura kramosom mu seisei|Ebio|figa/kaasɛ|Farebae|AKAkan|Kratafa titriw|kasa interface( anyimu) : Akan|Kasa ma no mu-nsɛm : Arab kasa|ɔdio|Ntwatiaa / wɔabɔ no tɔfa wɔ mu no te ase ma Umrah|play|pause|stop|mute|unmute|max volume|Kasakyerɛ ni :|Farebae:|17 / 11 / 1432 , 15/10/2011|Nhyehyɛmu:|Jurisprudence/ Esum Nimdea|Som|Hajj na Umrah|Jurisprudence/ Esum Nimdea|Som|Hajj na Umrah|Mmira ma Hajj na Umrah|nkyerɛmu|kasamu /sɛntɛns ma te ase na Umrah wɔ ... mu no hann ma no Quran na Sunnah na te ase ma no nana na no kasamu /sɛntɛns ma bi ma no emerging yi adu obusuani|Akenkane we ye di ko kasa bi su (36)|Afar - Qafár afa|Akan|Amhari ne - አማርኛ|Arab kasa - عربي|Assamese - অসমীয়া|Bengali - বাংলা|Maldive - ދިވެހި|Greek - Ελληνικά|English ( brofo kasa) - English|Persian - فارسی|Fula - pulla|French - Français|Hausa - Hausa|Kurdish - كوردی سۆرانی|Uganda ne - Oluganda|Mandinka - Mandinko|Malayalam - മലയാളം|Nepali - नेपाली|Portuguese - Português|Russian - Русский|Sango - Sango|Sinhalese - සිංහල|Somali - Soomaali|Albania ne - Shqip|Swahili - Kiswahili|Telugu - తెలుగు ప్రజలు|Tajik - Тоҷикӣ|Thai - ไทย|Tagalog - Tagalog|Turkish - Türkçe|Uyghur - ئۇيغۇرچە|Urdu - اردو|Uzbeck ne - Ўзбек тили|Vietnamese - Việt Nam|Wolof - Wolof|Chine ne - 中文|Soma kɔ bi kyerɛ adwen kɔ wɛb ebusuapanin|Soma kɔ ne kɔ hom adamfo|Soma kɔ bi kyerɛ adwen kɔ wɛb ebusuapanin|Nsɔwso fael (1)|1|الموجز في فقه العمرة|MP3 14.7 MB|Enoumah ebatahu|Rituals/Esom ajomadie ewu Hajji mmire .. 
1434 AH [01] no fapemso Enum|Fiidbak/ Ye hiya wu jun kyiri|Lenke de yɛe|kɔntakt yɛn|Aɛn webside|Qura'an Kro kronkrom|Balagh|wɔ mfinimfin Dowload faele|Yɛ atuu bra Islam mu afei|Tsin de yɛe ewu|Anaa bomu/combine hɛn melin liste|© Islamhouse Website/ Islam dan webi site|×|×|Yi mu kasa|", 'en_XX': 'SUMMARY in the jurisprudence of Umrah - Arabic - Abdul Aziz Bin Marzooq Al-Turaifi|Islamhouse.com|Follow us:|facebook|twitter|QuranEnc.com|HadeethEnc.com|Type|Titles All|Home Page|All Languages|Categories|Know about Islam|All items|All items (4057)|Books (701)|Articles (548)|Fatawa (370)|Videos (1853)|Audios (416)|Posters (98)|Greeting cards (22)|Favorites (25)|Applications (21)|Desktop Applications (3)|To convert to Islam now !|More|Figures|Sources|Curriculums|Our Services|QuranEnc.com|HadeethEnc.com|ENEnglish|Main Page|Interface Language : English|Language of the content : Arabic|Audios|تعريب عنوان المادة|SUMMARY in the jurisprudence of Umrah|play|pause|stop|mute|unmute|max volume|Lecturer : Abdul Aziz Bin Marzooq Al-Turaifi|Sources:|AlRaya Islamic Recoding in Riyadh|17 / 11 / 1432 , 15/10/2011|Categories:|Islamic Fiqh|Fiqh of Worship|Hajj and Umrah|Islamic Fiqh|Fiqh of Worship|Hajj and Umrah|Pilgrimage and Umrah|Description|SUMMARY in jurisprudence of Umrah: A statement of jurisprudence and Umrah in the light of the Quran and Sunnah and understanding of the Ancestors and the statement of some of the emerging issues related to them.|This page translated into (36)|Afar - Qafár afa|Akane - Akan|Amharic - አማርኛ|Arabic - عربي|Assamese - অসমীয়া|Bengali - বাংলা|Maldivi - ދިވެހި|Greek - Ελληνικά|English|Persian - فارسی|Fula - pulla|French - Français|Hausa - Hausa|kurdish - كوردی سۆرانی|Ugandan - Oluganda|Mandinka - Mandinko|Malayalam - മലയാളം|Nepali - नेपाली|Portuguese - Português|Russian - Русский|Sango - Yanga ti Sango|Sinhalese - සිංහල|Somali - Soomaali|Albanian - Shqip|Swahili - Kiswahili|Telugu - తెలుగు|Tajik - Тоҷикӣ|Thai - ไทย|Tagalog - Tagalog|Turkish - Türkçe|Uyghur - 
ئۇيغۇرچە|Urdu - اردو|Uzbek - Ўзбек тили|Vietnamese - Việt Nam|Wolof - Wolof|Chinese - 中文|Send a comment to Webmaster|Send to a friend?|Send a comment to Webmaster|Attachments (1)|1|الموجز في فقه العمرة|MP3 14.7 MB|The relevant Material|The rituals of the pilgrimage season .. 1434 AH [ 01] the fifth pillar|The Quality of the Accepted Hajj (Piligrimage) and Its Limitations|Easy Path to the Rules of the Rites of Hajj|A Call to the Pilgrims of the Scared House of Allah|More|feedback|Important links|Contact us|Privacy policy|Islam Q&A|Learning Arabic Language|About Us|Convert To Islam|Noble Quran encyclopedia|IslamHouse.com Reader|Encyclopedia of Translated Prophetic Hadiths|Our Services|The Quran|Balagh|Center for downloading files|To embrace Islam now...|Follow us through|Or join our mailing list.|© Islamhouse Website|×|×|Choose language|'}}
```
An instance of `sentences` type for language `ak_GH`:
```
{'LASER_similarity': 1.4549942016601562, 'translation': {'ak_GH': 'Salah (nyamefere) ye Mmerebeia', 'en_XX': 'What he dislikes when fasting (10)'}}
```
### Data Fields
For `documents` type:
- `Domain`: a `string` feature containing the domain.
- `Source_URL`: a `string` feature containing the source URL.
- `Target_URL`: a `string` feature containing the target URL.
- `translation`: a `dictionary` feature with two keys :
- `en_XX`: a `string` feature containing the content in English.
- <language_code>: a `string` feature containing the content in the `language_code` specified.
For `sentences` type:
- `LASER_similarity`: a `float32` feature representing the LASER similarity score.
- `translation`: a `dictionary` feature with two keys :
- `en_XX`: a `string` feature containing the content in English.
- <language_code>: a `string` feature containing the content in the `language_code` specified.
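As an illustration of working with these fields, the following sketch filters `sentences`-type records by their `LASER_similarity` score. The records are inlined rather than loaded with `datasets`, and the 1.04 threshold is an arbitrary illustrative choice, not one used by the dataset authors:

```python
# Filter sentence pairs by LASER similarity; the first record mirrors
# the `sentences` instance shown above, the second is a made-up
# low-similarity pair for contrast.

records = [
    {'LASER_similarity': 1.4549942016601562,
     'translation': {'ak_GH': 'Salah (nyamefere) ye Mmerebeia',
                     'en_XX': 'What he dislikes when fasting (10)'}},
    {'LASER_similarity': 1.01,
     'translation': {'ak_GH': '...', 'en_XX': '...'}},
]

def keep_confident(records, threshold=1.04):
    """Keep translation pairs whose LASER similarity exceeds the threshold."""
    return [r['translation'] for r in records
            if r['LASER_similarity'] > threshold]

pairs = keep_confident(records)
print(len(pairs))  # 1
```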
### Data Splits
Split sizes of some small configurations:
| name |train|
|----------|----:|
|documents-zz_TR|41|
|sentences-zz_TR|34|
|documents-tz_MA|4|
|sentences-tz_MA|33|
|documents-ak_GH|249|
|sentences-ak_GH|478|
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
[Needs More Information]
### Citation Information
```
@inproceedings{elkishky_ccaligned_2020,
author = {El-Kishky, Ahmed and Chaudhary, Vishrav and Guzm{\'a}n, Francisco and Koehn, Philipp},
booktitle = {Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP 2020)},
month = {November},
title = {{CCAligned}: A Massive Collection of Cross-lingual Web-Document Pairs},
year = {2020},
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2020.emnlp-main.480",
doi = "10.18653/v1/2020.emnlp-main.480",
pages = "5960--5969"
}
```
### Contributions
Thanks to [@gchhablani](https://github.com/gchhablani) for adding this dataset. |
Falah/mathematical_fashion_style_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 522685
num_examples: 1627
download_size: 0
dataset_size: 522685
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Mathematical Fashion Style Prompts"
## Dataset Info
This dataset contains prompts for artists to create artistic content based on mathematical terms and math style.
### Features
- `prompts`: A string containing the artistic prompts related to mathematical terms and math style.
### Splits
- `train`:
- Number of examples: 1627
- Size in bytes: 522,685
### Download Size
The dataset can be downloaded free of charge.
### Dataset Size
The total size of the dataset is 522,685 bytes.
## Configurations
- Config Name: default
- Data Files:
- Split: train
- Path: data/train-*
## Usage
This dataset provides creative prompts for artists to explore the intersection between mathematics and artistic expression. The prompts can serve as a source of inspiration for creating unique and imaginative artwork. Artists can interpret the mathematical terms and math style in their own creative ways, resulting in a diverse and fascinating range of artistic outputs.

------------------------------------

-----------------------------------------------------

## Citation
If you use this dataset in your research or any other work, please cite it as follows:
```
@misc{falah_g_salieh_2023,
title = {Mathematical Fashion Style Prompts Dataset},
author = {Falah G. Salieh},
year = {2023},
publisher = {HuggingFace Hub},
url = {\url{https://huggingface.co/datasets/Falah/mathematical_fashion_style_prompts/}}
}
```
## License
This dataset is made available under the Apache License 2.0. Please refer to the LICENSE file provided in the dataset for the complete license terms.
---
|
open-llm-leaderboard/details_Yuma42__KangalKhan-Sapphire-7B | ---
pretty_name: Evaluation run of Yuma42/KangalKhan-Sapphire-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Yuma42/KangalKhan-Sapphire-7B](https://huggingface.co/Yuma42/KangalKhan-Sapphire-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Yuma42__KangalKhan-Sapphire-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-15T11:23:46.531546](https://huggingface.co/datasets/open-llm-leaderboard/details_Yuma42__KangalKhan-Sapphire-7B/blob/main/results_2024-02-15T11-23-46.531546.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6358309379887943,\n\
\ \"acc_stderr\": 0.03223739791661606,\n \"acc_norm\": 0.637443452667424,\n\
\ \"acc_norm_stderr\": 0.032881114830308686,\n \"mc1\": 0.3929008567931457,\n\
\ \"mc1_stderr\": 0.017097248285233065,\n \"mc2\": 0.5609459047030728,\n\
\ \"mc2_stderr\": 0.015392383178013528\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6254266211604096,\n \"acc_stderr\": 0.014144193471893456,\n\
\ \"acc_norm\": 0.6629692832764505,\n \"acc_norm_stderr\": 0.013813476652902276\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6666998605855408,\n\
\ \"acc_stderr\": 0.004704293898729911,\n \"acc_norm\": 0.853415654252141,\n\
\ \"acc_norm_stderr\": 0.0035296822858572325\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.042849586397534015,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.042849586397534015\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\
\ \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.6011560693641619,\n\
\ \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.0470070803355104,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.0470070803355104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268552,\n \"\
acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268552\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121437,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121437\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6128205128205129,\n \"acc_stderr\": 0.024697216930878934,\n\
\ \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.024697216930878934\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530336,\n \"\
acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530336\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229136,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229136\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.035477710041594654,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.035477710041594654\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742179,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742179\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.02433214677913413,\n\
\ \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.02433214677913413\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.329608938547486,\n\
\ \"acc_stderr\": 0.01572153107518387,\n \"acc_norm\": 0.329608938547486,\n\
\ \"acc_norm_stderr\": 0.01572153107518387\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292452,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292452\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n\
\ \"acc_stderr\": 0.01274307294265334,\n \"acc_norm\": 0.46740547588005216,\n\
\ \"acc_norm_stderr\": 0.01274307294265334\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462937,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462937\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n\
\ \"acc_stderr\": 0.027686913588013007,\n \"acc_norm\": 0.8109452736318408,\n\
\ \"acc_norm_stderr\": 0.027686913588013007\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3929008567931457,\n\
\ \"mc1_stderr\": 0.017097248285233065,\n \"mc2\": 0.5609459047030728,\n\
\ \"mc2_stderr\": 0.015392383178013528\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7813733228097869,\n \"acc_stderr\": 0.011616198215773234\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6194086429112965,\n \
\ \"acc_stderr\": 0.013373971277729817\n }\n}\n```"
repo_url: https://huggingface.co/Yuma42/KangalKhan-Sapphire-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|arc:challenge|25_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|gsm8k|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hellaswag|10_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T11-23-46.531546.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-15T11-23-46.531546.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- '**/details_harness|winogrande|5_2024-02-15T11-23-46.531546.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-15T11-23-46.531546.parquet'
- config_name: results
data_files:
- split: 2024_02_15T11_23_46.531546
path:
- results_2024-02-15T11-23-46.531546.parquet
- split: latest
path:
- results_2024-02-15T11-23-46.531546.parquet
---
# Dataset Card for Evaluation run of Yuma42/KangalKhan-Sapphire-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Yuma42/KangalKhan-Sapphire-7B](https://huggingface.co/Yuma42/KangalKhan-Sapphire-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Yuma42__KangalKhan-Sapphire-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-15T11:23:46.531546](https://huggingface.co/datasets/open-llm-leaderboard/details_Yuma42__KangalKhan-Sapphire-7B/blob/main/results_2024-02-15T11-23-46.531546.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6358309379887943,
"acc_stderr": 0.03223739791661606,
"acc_norm": 0.637443452667424,
"acc_norm_stderr": 0.032881114830308686,
"mc1": 0.3929008567931457,
"mc1_stderr": 0.017097248285233065,
"mc2": 0.5609459047030728,
"mc2_stderr": 0.015392383178013528
},
"harness|arc:challenge|25": {
"acc": 0.6254266211604096,
"acc_stderr": 0.014144193471893456,
"acc_norm": 0.6629692832764505,
"acc_norm_stderr": 0.013813476652902276
},
"harness|hellaswag|10": {
"acc": 0.6666998605855408,
"acc_stderr": 0.004704293898729911,
"acc_norm": 0.853415654252141,
"acc_norm_stderr": 0.0035296822858572325
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.042849586397534015,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.042849586397534015
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105653,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105653
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.0470070803355104,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.0470070803355104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268552,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268552
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047711,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047711
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121437,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6128205128205129,
"acc_stderr": 0.024697216930878934,
"acc_norm": 0.6128205128205129,
"acc_norm_stderr": 0.024697216930878934
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530336,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530336
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229136,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229136
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.035477710041594654,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.035477710041594654
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742179,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742179
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.02433214677913413,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.02433214677913413
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.329608938547486,
"acc_stderr": 0.01572153107518387,
"acc_norm": 0.329608938547486,
"acc_norm_stderr": 0.01572153107518387
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292452,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292452
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.024288533637726095,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.024288533637726095
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46740547588005216,
"acc_stderr": 0.01274307294265334,
"acc_norm": 0.46740547588005216,
"acc_norm_stderr": 0.01274307294265334
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462937,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462937
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.027686913588013007,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.027686913588013007
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3929008567931457,
"mc1_stderr": 0.017097248285233065,
"mc2": 0.5609459047030728,
"mc2_stderr": 0.015392383178013528
},
"harness|winogrande|5": {
"acc": 0.7813733228097869,
"acc_stderr": 0.011616198215773234
},
"harness|gsm8k|5": {
"acc": 0.6194086429112965,
"acc_stderr": 0.013373971277729817
}
}
```
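As a small illustration, per-task entries in a results payload shaped like the JSON above can be aggregated with a short helper (a sketch only — the key naming follows the JSON shown, and the sample dict below is made up for demonstration):

```python
def mmlu_average(results: dict) -> float:
    """Average the 'acc' metric over the hendrycksTest (MMLU) entries
    of a results dict shaped like the JSON above."""
    accs = [
        metrics["acc"]
        for task, metrics in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    return sum(accs) / len(accs)

# Illustrative (not real) values:
sample = {
    "harness|hendrycksTest-virology|5": {"acc": 0.54},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.84},
    "harness|winogrande|5": {"acc": 0.78},  # ignored: not an MMLU task
}
```

Here `mmlu_average(sample)` averages only the two MMLU entries, giving roughly 0.69.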
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mar-yam1497/HotPotQA_Mistral_Instruct_dataset_Top3k_Revised | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 15524892
num_examples: 3000
download_size: 6915713
dataset_size: 15524892
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
iamkaikai/CLASSICAL-ART | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 8615731.0
num_examples: 219
download_size: 8580828
dataset_size: 8615731.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nateraw/snowflaketest | ---
license: mit
---
|
jaygala223/38-cloud-dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 757246236.0
num_examples: 8400
download_size: 754389599
dataset_size: 757246236.0
---
# Dataset Card for "38-cloud-train-only-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rbx-imarcin/llama2-ft-test-dataset | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
pretty_name: llama2-ft-test-dataset
size_categories:
- n<1K
--- |
fazeelzafar/Test-Text2Code-Java | ---
task_categories:
- text-classification
- question-answering
size_categories:
- 100K<n<1M
--- |
tanganke/EuroSAT | ---
task_categories:
- image-classification
---
# EuroSAT
EuroSAT: Downloaded from https://github.com/phelber/EuroSAT (direct link: https://madm.dfki.de/files/sentinel/EuroSAT.zip).
For this dataset we randomly split the downloaded data into train/validation/test (21,600/2,700/2,700 samples). |
Anwarkh1/ISIC_2019_Training_Input | ---
license: apache-2.0
dataset_info:
features:
- name: image
dtype: image
- name: dx
dtype: string
- name: age_approx
dtype: float64
- name: anatom_site_general
dtype: string
- name: lesion_id
dtype: string
- name: sex
dtype: string
splits:
- name: train
num_bytes: 4789886766.62
num_examples: 25331
download_size: 9801543287
dataset_size: 4789886766.62
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
moonmelonpizza/constitution_of_india | ---
license: unknown
---
|
arize-ai/xtreme_en_language_drift_es | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- mit
multilinguality:
- monolingual
pretty_name: named-entity-recognition-en-no-drift
size_categories:
- 10K<n<100K
source_datasets:
- extended|xtreme
task_categories:
- token-classification
task_ids:
- named-entity-recognition
---
# Dataset Card for `reviews_with_drift`
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
### Dataset Summary
This dataset was crafted to be used in our tutorial [Link to the tutorial when ready]. It consists of a large Movie Review Dataset mixed with some reviews from a Hotel Review Dataset. The training/validation sets are obtained purely from the Movie Review Dataset, while the production set is mixed. Some other features have been added (`age`, `gender`, `context`), as well as a made-up timestamp `prediction_ts` of when the inference took place.
### Supported Tasks and Leaderboards
`text-classification`, `sentiment-classification`: The dataset is mainly used for text classification: given the text, predict the sentiment (positive or negative).
### Languages
Text is mainly written in English.
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@fjcasti1](https://github.com/fjcasti1) for adding this dataset. |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_199 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1099627584.0
num_examples: 215952
download_size: 1120568825
dataset_size: 1099627584.0
---
# Dataset Card for "chunk_199"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
alarmod/forest_fire | ---
license: gpl-3.0
---
The initial UAV pictures (FLAME dataset, file 9), sized 3840 × 2160, were split into nine non-overlapping parts, each 1280 × 720 pixels. Information about fire locations was extracted from the segmentation masks prepared by the developers of the primary dataset (FLAME, file 10); a connected-component labeling algorithm then extracted the fire points, and information about their sizes and coordinates was saved in text files. Next, the image fragments, together with the exact locations of the fire spots, are fed to the ANN. In order to form both the test and training images from one dataset, the processed data were broken into batches of 90 images each. The first batch goes to training, the second one is purposefully discarded, the third one goes to testing, and the fourth one is discarded. This is done deliberately to ensure that the test and training sets come from frames of the video stream that are distant in time. Eventually, the training and test sets each contain 4500 images (with a textual description of the fire zones); 2601 and 2604 of them, respectively, are images of forest without fire.
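The tiling and batch-selection scheme above can be sketched in a few lines of Python (a rough illustration; the function names and the "keep the 1st and 3rd of every four batches" pattern are inferred from the description, not taken from the original code):

```python
# Sketch of the 3x3 tiling and the train/test batch selection described above.
# All constants come from the card; function names are illustrative.

FRAME_W, FRAME_H = 3840, 2160   # source UAV frame size
TILE_W, TILE_H = 1280, 720      # each frame yields 3 x 3 = 9 tiles
BATCH = 90                      # images per batch

def tile_boxes():
    """Return (left, upper, right, lower) boxes for the nine non-overlapping tiles."""
    return [
        (x, y, x + TILE_W, y + TILE_H)
        for y in range(0, FRAME_H, TILE_H)
        for x in range(0, FRAME_W, TILE_W)
    ]

def split_batches(images, batch=BATCH):
    """Keep the 1st batch (train) and the 3rd batch (test) out of every four,
    discarding the 2nd and 4th so train and test frames are separated in time."""
    train, test = [], []
    for i in range(0, len(images), batch):
        pos = (i // batch) % 4
        if pos == 0:
            train.extend(images[i:i + batch])
        elif pos == 2:
            test.extend(images[i:i + batch])
    return train, test
```

With, for example, 720 tile images, this yields 180 training and 180 test images drawn from non-adjacent stretches of the video.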
Thanks to
_Shamsoshoara, A.; Afghah, F.; Razi, A.; Zheng, L.; Fulé, P. The Flame Dataset: Aerial Imagery Pile Burn Detection Using Drones (UAVS). 2021. Available online: https://ieee-dataport.org/open-access/flame-dataset-aerial-imagery-pile-burn-detection-using-drones-uavs, doi: 10.21227/qad6-r683_ |
autoevaluate/autoeval-eval-conll2003-conll2003-bc26c9-1485554291 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- conll2003
eval_info:
task: entity_extraction
model: chandrasutrisnotjhong/bert-finetuned-ner
metrics: []
dataset_name: conll2003
dataset_config: conll2003
dataset_split: test
col_mapping:
tokens: tokens
tags: ner_tags
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Token Classification
* Model: chandrasutrisnotjhong/bert-finetuned-ner
* Dataset: conll2003
* Config: conll2003
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
tnhyk/abx | ---
license: other
---
|
raygx/NepCov19TweetsPlus | ---
dataset_info:
features:
- name: Sentiment
dtype: int64
- name: Sentences
dtype: string
splits:
- name: train
num_bytes: 14110875
num_examples: 41541
download_size: 5219950
dataset_size: 14110875
---
# Dataset Card for "NepCov19TweetsPlus"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
seongwoon/labor_market_context_data | ---
license: cc-by-nc-nd-4.0
---
|
Ssunbell/boostcamp-docvqa-v5-test | ---
dataset_info:
features:
- name: questionId
dtype: int64
- name: question
dtype: string
- name: image
sequence:
sequence:
sequence: uint8
- name: docId
dtype: int64
- name: ucsf_document_id
dtype: string
- name: ucsf_document_page_no
dtype: string
- name: data_split
dtype: string
- name: words
sequence: string
- name: boxes
sequence:
sequence: int64
splits:
- name: test
num_bytes: 843083964
num_examples: 5188
download_size: 296859136
dataset_size: 843083964
---
# Dataset Card for "boostcamp-docvqa-v5-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Tiefighter | ---
pretty_name: Evaluation run of KoboldAI/LLaMA2-13B-Tiefighter
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KoboldAI/LLaMA2-13B-Tiefighter](https://huggingface.co/KoboldAI/LLaMA2-13B-Tiefighter)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Tiefighter_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-14T20:25:09.144693](https://huggingface.co/datasets/open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Tiefighter_public/blob/main/results_2023-11-14T20-25-09.144693.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5460312696004332,\n\
\ \"acc_stderr\": 0.03357446611113244,\n \"acc_norm\": 0.5555362057698711,\n\
\ \"acc_norm_stderr\": 0.03444530254256153,\n \"mc1\": 0.379436964504284,\n\
\ \"mc1_stderr\": 0.016987039266142985,\n \"mc2\": 0.5301656358073983,\n\
\ \"mc2_stderr\": 0.01568757011022921,\n \"em\": 0.11115771812080537,\n\
\ \"em_stderr\": 0.00321900621779521,\n \"f1\": 0.1838915687919454,\n\
\ \"f1_stderr\": 0.0033646558993111948\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.568259385665529,\n \"acc_stderr\": 0.014474591427196202,\n\
\ \"acc_norm\": 0.5989761092150171,\n \"acc_norm_stderr\": 0.014322255790719867\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6500697072296355,\n\
\ \"acc_stderr\": 0.004759729267943188,\n \"acc_norm\": 0.8399721171081458,\n\
\ \"acc_norm_stderr\": 0.003658826208101615\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.04017901275981749,\n\
\ \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.04017901275981749\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.030151134457776285,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.030151134457776285\n \
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5972222222222222,\n\
\ \"acc_stderr\": 0.04101405519842426,\n \"acc_norm\": 0.5972222222222222,\n\
\ \"acc_norm_stderr\": 0.04101405519842426\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n\
\ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.5317919075144508,\n\
\ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006716,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006716\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.65,\n \"acc_stderr\": 0.04793724854411021,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.04793724854411021\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.045144961328736334,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.045144961328736334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30952380952380953,\n \"acc_stderr\": 0.023809523809523857,\n \"\
acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.023809523809523857\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.04073524322147125,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.04073524322147125\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6387096774193548,\n\
\ \"acc_stderr\": 0.027327548447957536,\n \"acc_norm\": 0.6387096774193548,\n\
\ \"acc_norm_stderr\": 0.027327548447957536\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.03465304488406795,\n\
\ \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.03465304488406795\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.036639749943912434,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.036639749943912434\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.702020202020202,\n \"acc_stderr\": 0.03258630383836556,\n \"acc_norm\"\
: 0.702020202020202,\n \"acc_norm_stderr\": 0.03258630383836556\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.7772020725388601,\n \"acc_stderr\": 0.03003114797764154,\n\
\ \"acc_norm\": 0.7772020725388601,\n \"acc_norm_stderr\": 0.03003114797764154\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5051282051282051,\n \"acc_stderr\": 0.025349672906838653,\n\
\ \"acc_norm\": 0.5051282051282051,\n \"acc_norm_stderr\": 0.025349672906838653\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871937,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871937\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5462184873949579,\n \"acc_stderr\": 0.03233943468182087,\n \
\ \"acc_norm\": 0.5462184873949579,\n \"acc_norm_stderr\": 0.03233943468182087\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7211009174311926,\n \"acc_stderr\": 0.019227468876463507,\n \"\
acc_norm\": 0.7211009174311926,\n \"acc_norm_stderr\": 0.019227468876463507\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.37962962962962965,\n \"acc_stderr\": 0.03309682581119035,\n \"\
acc_norm\": 0.37962962962962965,\n \"acc_norm_stderr\": 0.03309682581119035\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7468354430379747,\n \"acc_stderr\": 0.0283046579430353,\n\
\ \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.0283046579430353\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455005,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455005\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.042607351576445594,\n\
\ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.042607351576445594\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.037311335196738925,\n\
\ \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.037311335196738925\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8162393162393162,\n\
\ \"acc_stderr\": 0.025372139671722933,\n \"acc_norm\": 0.8162393162393162,\n\
\ \"acc_norm_stderr\": 0.025372139671722933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7458492975734355,\n\
\ \"acc_stderr\": 0.015569254692045752,\n \"acc_norm\": 0.7458492975734355,\n\
\ \"acc_norm_stderr\": 0.015569254692045752\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6213872832369942,\n \"acc_stderr\": 0.026113749361310345,\n\
\ \"acc_norm\": 0.6213872832369942,\n \"acc_norm_stderr\": 0.026113749361310345\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33631284916201115,\n\
\ \"acc_stderr\": 0.0158010037291459,\n \"acc_norm\": 0.33631284916201115,\n\
\ \"acc_norm_stderr\": 0.0158010037291459\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.027826109307283697,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.027826109307283697\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6141479099678456,\n\
\ \"acc_stderr\": 0.027648149599751464,\n \"acc_norm\": 0.6141479099678456,\n\
\ \"acc_norm_stderr\": 0.027648149599751464\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6080246913580247,\n \"acc_stderr\": 0.027163686038271146,\n\
\ \"acc_norm\": 0.6080246913580247,\n \"acc_norm_stderr\": 0.027163686038271146\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40070921985815605,\n \"acc_stderr\": 0.029233465745573083,\n \
\ \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.029233465745573083\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4335071707953064,\n\
\ \"acc_stderr\": 0.012656810383983965,\n \"acc_norm\": 0.4335071707953064,\n\
\ \"acc_norm_stderr\": 0.012656810383983965\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5257352941176471,\n \"acc_stderr\": 0.030332578094555033,\n\
\ \"acc_norm\": 0.5257352941176471,\n \"acc_norm_stderr\": 0.030332578094555033\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5604575163398693,\n \"acc_stderr\": 0.020079420408087918,\n \
\ \"acc_norm\": 0.5604575163398693,\n \"acc_norm_stderr\": 0.020079420408087918\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.636734693877551,\n \"acc_stderr\": 0.030789051139030802,\n\
\ \"acc_norm\": 0.636734693877551,\n \"acc_norm_stderr\": 0.030789051139030802\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n\
\ \"acc_stderr\": 0.03152439186555401,\n \"acc_norm\": 0.7263681592039801,\n\
\ \"acc_norm_stderr\": 0.03152439186555401\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.379436964504284,\n\
\ \"mc1_stderr\": 0.016987039266142985,\n \"mc2\": 0.5301656358073983,\n\
\ \"mc2_stderr\": 0.01568757011022921\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.745067087608524,\n \"acc_stderr\": 0.012248806969376422\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.11115771812080537,\n \
\ \"em_stderr\": 0.00321900621779521,\n \"f1\": 0.1838915687919454,\n \
\ \"f1_stderr\": 0.0033646558993111948\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.006823351023502654,\n \"acc_stderr\": 0.002267537102254515\n\
\ }\n}\n```"
repo_url: https://huggingface.co/KoboldAI/LLaMA2-13B-Tiefighter
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|arc:challenge|25_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|drop|3_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|gsm8k|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hellaswag|10_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-14T20-25-09.144693.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-14T20-25-09.144693.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- '**/details_harness|winogrande|5_2023-11-14T20-25-09.144693.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-14T20-25-09.144693.parquet'
- config_name: results
data_files:
- split: 2023_11_14T20_25_09.144693
path:
- results_2023-11-14T20-25-09.144693.parquet
- split: latest
path:
- results_2023-11-14T20-25-09.144693.parquet
---
# Dataset Card for Evaluation run of KoboldAI/LLaMA2-13B-Tiefighter
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KoboldAI/LLaMA2-13B-Tiefighter
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KoboldAI/LLaMA2-13B-Tiefighter](https://huggingface.co/KoboldAI/LLaMA2-13B-Tiefighter) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Tiefighter_public",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-11-14T20:25:09.144693](https://huggingface.co/datasets/open-llm-leaderboard/details_KoboldAI__LLaMA2-13B-Tiefighter_public/blob/main/results_2023-11-14T20-25-09.144693.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in the "latest" split of its configuration):
```python
{
"all": {
"acc": 0.5460312696004332,
"acc_stderr": 0.03357446611113244,
"acc_norm": 0.5555362057698711,
"acc_norm_stderr": 0.03444530254256153,
"mc1": 0.379436964504284,
"mc1_stderr": 0.016987039266142985,
"mc2": 0.5301656358073983,
"mc2_stderr": 0.01568757011022921,
"em": 0.11115771812080537,
"em_stderr": 0.00321900621779521,
"f1": 0.1838915687919454,
"f1_stderr": 0.0033646558993111948
},
"harness|arc:challenge|25": {
"acc": 0.568259385665529,
"acc_stderr": 0.014474591427196202,
"acc_norm": 0.5989761092150171,
"acc_norm_stderr": 0.014322255790719867
},
"harness|hellaswag|10": {
"acc": 0.6500697072296355,
"acc_stderr": 0.004759729267943188,
"acc_norm": 0.8399721171081458,
"acc_norm_stderr": 0.003658826208101615
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.04017901275981749,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.04017901275981749
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6,
"acc_stderr": 0.030151134457776285,
"acc_norm": 0.6,
"acc_norm_stderr": 0.030151134457776285
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.04101405519842426,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.04101405519842426
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006716,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006716
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411021,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411021
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.045144961328736334,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.045144961328736334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.023809523809523857,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.023809523809523857
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.04073524322147125,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.04073524322147125
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6387096774193548,
"acc_stderr": 0.027327548447957536,
"acc_norm": 0.6387096774193548,
"acc_norm_stderr": 0.027327548447957536
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.03465304488406795,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.03465304488406795
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.036639749943912434,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.036639749943912434
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.702020202020202,
"acc_stderr": 0.03258630383836556,
"acc_norm": 0.702020202020202,
"acc_norm_stderr": 0.03258630383836556
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7772020725388601,
"acc_stderr": 0.03003114797764154,
"acc_norm": 0.7772020725388601,
"acc_norm_stderr": 0.03003114797764154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5051282051282051,
"acc_stderr": 0.025349672906838653,
"acc_norm": 0.5051282051282051,
"acc_norm_stderr": 0.025349672906838653
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871937,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871937
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5462184873949579,
"acc_stderr": 0.03233943468182087,
"acc_norm": 0.5462184873949579,
"acc_norm_stderr": 0.03233943468182087
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7211009174311926,
"acc_stderr": 0.019227468876463507,
"acc_norm": 0.7211009174311926,
"acc_norm_stderr": 0.019227468876463507
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37962962962962965,
"acc_stderr": 0.03309682581119035,
"acc_norm": 0.37962962962962965,
"acc_norm_stderr": 0.03309682581119035
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.0283046579430353,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.0283046579430353
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455005,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455005
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.042607351576445594,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.042607351576445594
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.043300437496507416,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.043300437496507416
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.037311335196738925,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.037311335196738925
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8162393162393162,
"acc_stderr": 0.025372139671722933,
"acc_norm": 0.8162393162393162,
"acc_norm_stderr": 0.025372139671722933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7458492975734355,
"acc_stderr": 0.015569254692045752,
"acc_norm": 0.7458492975734355,
"acc_norm_stderr": 0.015569254692045752
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6213872832369942,
"acc_stderr": 0.026113749361310345,
"acc_norm": 0.6213872832369942,
"acc_norm_stderr": 0.026113749361310345
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33631284916201115,
"acc_stderr": 0.0158010037291459,
"acc_norm": 0.33631284916201115,
"acc_norm_stderr": 0.0158010037291459
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.027826109307283697,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.027826109307283697
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6141479099678456,
"acc_stderr": 0.027648149599751464,
"acc_norm": 0.6141479099678456,
"acc_norm_stderr": 0.027648149599751464
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6080246913580247,
"acc_stderr": 0.027163686038271146,
"acc_norm": 0.6080246913580247,
"acc_norm_stderr": 0.027163686038271146
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40070921985815605,
"acc_stderr": 0.029233465745573083,
"acc_norm": 0.40070921985815605,
"acc_norm_stderr": 0.029233465745573083
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4335071707953064,
"acc_stderr": 0.012656810383983965,
"acc_norm": 0.4335071707953064,
"acc_norm_stderr": 0.012656810383983965
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5257352941176471,
"acc_stderr": 0.030332578094555033,
"acc_norm": 0.5257352941176471,
"acc_norm_stderr": 0.030332578094555033
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5604575163398693,
"acc_stderr": 0.020079420408087918,
"acc_norm": 0.5604575163398693,
"acc_norm_stderr": 0.020079420408087918
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.636734693877551,
"acc_stderr": 0.030789051139030802,
"acc_norm": 0.636734693877551,
"acc_norm_stderr": 0.030789051139030802
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.03152439186555401,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.03152439186555401
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263686,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263686
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.031581495393387324,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.031581495393387324
},
"harness|truthfulqa:mc|0": {
"mc1": 0.379436964504284,
"mc1_stderr": 0.016987039266142985,
"mc2": 0.5301656358073983,
"mc2_stderr": 0.01568757011022921
},
"harness|winogrande|5": {
"acc": 0.745067087608524,
"acc_stderr": 0.012248806969376422
},
"harness|drop|3": {
"em": 0.11115771812080537,
"em_stderr": 0.00321900621779521,
"f1": 0.1838915687919454,
"f1_stderr": 0.0033646558993111948
},
"harness|gsm8k|5": {
"acc": 0.006823351023502654,
"acc_stderr": 0.002267537102254515
}
}
```
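The aggregated metrics under `"all"` sit alongside the per-task entries in the same JSON object. As a minimal sketch (using a hypothetical, truncated payload that mirrors the structure above, not the real results file), such a file can be read with the standard library:

```python
import json

# Hypothetical excerpt mirroring the structure of a results file such as
# results_2023-11-14T20-25-09.144693.json (values truncated for brevity).
payload = """
{
  "all": {"acc": 0.5460, "acc_norm": 0.5555},
  "harness|winogrande|5": {"acc": 0.7451, "acc_stderr": 0.0122}
}
"""

results = json.loads(payload)
winogrande_acc = results["harness|winogrande|5"]["acc"]
print(f"winogrande acc: {winogrande_acc:.4f}")
```

Note that task names contain `|` and `:` characters, so they must be accessed as plain string keys.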
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
InceptiveDev/Objective-data | ---
license: mit
---
|
ds4sd/FinTabNet_OTSL | ---
license: other
pretty_name: FinTabNet-OTSL
size_categories:
- 10K<n<100K
tags:
- table-structure-recognition
- table-understanding
- PDF
task_categories:
- object-detection
- table-to-text
---
# Dataset Card for FinTabNet_OTSL
## Dataset Description
- **Homepage:** https://ds4sd.github.io
- **Paper:** https://arxiv.org/pdf/2305.03393
### Dataset Summary
This dataset is a conversion of the original [FinTabNet](https://developer.ibm.com/exchanges/data/all/fintabnet/) dataset into the OTSL format presented in our paper "Optimized Table Tokenization for Table Structure Recognition". The dataset includes the original annotations alongside new additions.
### Dataset Structure
* cells: original dataset cell ground truth (content).
* otsl: new reduced table-structure token format.
* html: original dataset ground-truth HTML (structure).
* html_restored: HTML generated from OTSL.
* cols: number of grid columns.
* rows: number of grid rows.
* image: PIL image.
### OTSL Vocabulary:
**OTSL**: new reduced table-structure token format.
More information on the OTSL table-structure format and its concepts can be found in our paper.
The format used in this dataset extends the one presented in the paper and introduces slight modifications:
* "fcel" - cell that has content in it
* "ecel" - cell that is empty
* "lcel" - left-looking cell (to handle horizontally merged cells)
* "ucel" - up-looking cell (to handle vertically merged cells)
* "xcel" - 2d span cells, in this dataset - covers entire area of a merged cell
* "nl" - new line token
### Data Splits
The dataset provides three splits:
- `train`
- `val`
- `test`
## Additional Information
### Dataset Curators
The dataset is converted by the [Deep Search team](https://ds4sd.github.io/) at IBM Research.
You can contact us at [deepsearch-core@zurich.ibm.com](mailto:deepsearch-core@zurich.ibm.com).
Curators:
- Maksym Lysak, [@maxmnemonic](https://github.com/maxmnemonic)
- Ahmed Nassar, [@nassarofficial](https://github.com/nassarofficial)
- Christoph Auer, [@cau-git](https://github.com/cau-git)
- Nikos Livathinos, [@nikos-livathinos](https://github.com/nikos-livathinos)
- Peter Staar, [@PeterStaar-IBM](https://github.com/PeterStaar-IBM)
### Citation Information
```bib
@misc{lysak2023optimized,
title={Optimized Table Tokenization for Table Structure Recognition},
author={Maksym Lysak and Ahmed Nassar and Nikolaos Livathinos and Christoph Auer and Peter Staar},
year={2023},
eprint={2305.03393},
archivePrefix={arXiv},
primaryClass={cs.CV}
}
```
open-llm-leaderboard/details_athirdpath__Orca-2-13b-Alpaca-Uncensored | ---
pretty_name: Evaluation run of athirdpath/Orca-2-13b-Alpaca-Uncensored
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [athirdpath/Orca-2-13b-Alpaca-Uncensored](https://huggingface.co/athirdpath/Orca-2-13b-Alpaca-Uncensored)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_athirdpath__Orca-2-13b-Alpaca-Uncensored\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-17T14:58:45.053086](https://huggingface.co/datasets/open-llm-leaderboard/details_athirdpath__Orca-2-13b-Alpaca-Uncensored/blob/main/results_2024-02-17T14-58-45.053086.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6002703838763287,\n\
\ \"acc_stderr\": 0.032941926494755115,\n \"acc_norm\": 0.6047089287996544,\n\
\ \"acc_norm_stderr\": 0.03361522848210774,\n \"mc1\": 0.37821297429620565,\n\
\ \"mc1_stderr\": 0.01697633590754687,\n \"mc2\": 0.5358987502340753,\n\
\ \"mc2_stderr\": 0.01565060464007792\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5750853242320819,\n \"acc_stderr\": 0.014445698968520769,\n\
\ \"acc_norm\": 0.6109215017064846,\n \"acc_norm_stderr\": 0.014247309976045607\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6100378410675165,\n\
\ \"acc_stderr\": 0.004867445945277156,\n \"acc_norm\": 0.792670782712607,\n\
\ \"acc_norm_stderr\": 0.004045648954769832\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.036117805602848975,\n\
\ \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.036117805602848975\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.71,\n\
\ \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n\
\ \"acc_stderr\": 0.03778621079092056,\n \"acc_norm\": 0.5664739884393064,\n\
\ \"acc_norm_stderr\": 0.03778621079092056\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006718,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006718\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.03260038511835771,\n\
\ \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.03260038511835771\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37566137566137564,\n \"acc_stderr\": 0.024942368931159795,\n \"\
acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.024942368931159795\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7387096774193549,\n\
\ \"acc_stderr\": 0.024993053397764805,\n \"acc_norm\": 0.7387096774193549,\n\
\ \"acc_norm_stderr\": 0.024993053397764805\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.03510766597959217,\n\
\ \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.03510766597959217\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"\
acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.02541634309630642,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.02541634309630642\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5794871794871795,\n \"acc_stderr\": 0.025028610276710855,\n\
\ \"acc_norm\": 0.5794871794871795,\n \"acc_norm_stderr\": 0.025028610276710855\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524586,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524586\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6260504201680672,\n \"acc_stderr\": 0.031429466378837076,\n\
\ \"acc_norm\": 0.6260504201680672,\n \"acc_norm_stderr\": 0.031429466378837076\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8073394495412844,\n \"acc_stderr\": 0.016909276884936073,\n \"\
acc_norm\": 0.8073394495412844,\n \"acc_norm_stderr\": 0.016909276884936073\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621112,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621112\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764377,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764377\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384493,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384493\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077802,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077802\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.776500638569604,\n\
\ \"acc_stderr\": 0.01489723522945071,\n \"acc_norm\": 0.776500638569604,\n\
\ \"acc_norm_stderr\": 0.01489723522945071\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.025070713719153186,\n\
\ \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.025070713719153186\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3318435754189944,\n\
\ \"acc_stderr\": 0.015748421208187306,\n \"acc_norm\": 0.3318435754189944,\n\
\ \"acc_norm_stderr\": 0.015748421208187306\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.027121956071388852,\n\
\ \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.027121956071388852\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6816720257234726,\n\
\ \"acc_stderr\": 0.026457225067811025,\n \"acc_norm\": 0.6816720257234726,\n\
\ \"acc_norm_stderr\": 0.026457225067811025\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.025329888171900933,\n\
\ \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.025329888171900933\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4380704041720991,\n\
\ \"acc_stderr\": 0.012671902782567657,\n \"acc_norm\": 0.4380704041720991,\n\
\ \"acc_norm_stderr\": 0.012671902782567657\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5772058823529411,\n \"acc_stderr\": 0.030008562845003476,\n\
\ \"acc_norm\": 0.5772058823529411,\n \"acc_norm_stderr\": 0.030008562845003476\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6143790849673203,\n \"acc_stderr\": 0.01969145905235402,\n \
\ \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.01969145905235402\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n\
\ \"acc_stderr\": 0.03152439186555402,\n \"acc_norm\": 0.7263681592039801,\n\
\ \"acc_norm_stderr\": 0.03152439186555402\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37821297429620565,\n\
\ \"mc1_stderr\": 0.01697633590754687,\n \"mc2\": 0.5358987502340753,\n\
\ \"mc2_stderr\": 0.01565060464007792\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7742699289660616,\n \"acc_stderr\": 0.011749626260902545\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.38286580742987114,\n \
\ \"acc_stderr\": 0.013389223491820465\n }\n}\n```"
repo_url: https://huggingface.co/athirdpath/Orca-2-13b-Alpaca-Uncensored
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|arc:challenge|25_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|gsm8k|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hellaswag|10_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T14-58-45.053086.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-17T14-58-45.053086.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- '**/details_harness|winogrande|5_2024-02-17T14-58-45.053086.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-17T14-58-45.053086.parquet'
- config_name: results
data_files:
- split: 2024_02_17T14_58_45.053086
path:
- results_2024-02-17T14-58-45.053086.parquet
- split: latest
path:
- results_2024-02-17T14-58-45.053086.parquet
---
# Dataset Card for Evaluation run of athirdpath/Orca-2-13b-Alpaca-Uncensored
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [athirdpath/Orca-2-13b-Alpaca-Uncensored](https://huggingface.co/athirdpath/Orca-2-13b-Alpaca-Uncensored) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_athirdpath__Orca-2-13b-Alpaca-Uncensored",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-17T14:58:45.053086](https://huggingface.co/datasets/open-llm-leaderboard/details_athirdpath__Orca-2-13b-Alpaca-Uncensored/blob/main/results_2024-02-17T14-58-45.053086.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each eval's results in its timestamped split and in the "latest" split):
```python
{
"all": {
"acc": 0.6002703838763287,
"acc_stderr": 0.032941926494755115,
"acc_norm": 0.6047089287996544,
"acc_norm_stderr": 0.03361522848210774,
"mc1": 0.37821297429620565,
"mc1_stderr": 0.01697633590754687,
"mc2": 0.5358987502340753,
"mc2_stderr": 0.01565060464007792
},
"harness|arc:challenge|25": {
"acc": 0.5750853242320819,
"acc_stderr": 0.014445698968520769,
"acc_norm": 0.6109215017064846,
"acc_norm_stderr": 0.014247309976045607
},
"harness|hellaswag|10": {
"acc": 0.6100378410675165,
"acc_stderr": 0.004867445945277156,
"acc_norm": 0.792670782712607,
"acc_norm_stderr": 0.004045648954769832
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.036117805602848975,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.036117805602848975
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.03778621079092056,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.03778621079092056
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006718,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006718
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.03260038511835771,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.03260038511835771
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.024942368931159795,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.024942368931159795
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7387096774193549,
"acc_stderr": 0.024993053397764805,
"acc_norm": 0.7387096774193549,
"acc_norm_stderr": 0.024993053397764805
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.03510766597959217,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.03510766597959217
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.02541634309630642,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.02541634309630642
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5794871794871795,
"acc_stderr": 0.025028610276710855,
"acc_norm": 0.5794871794871795,
"acc_norm_stderr": 0.025028610276710855
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524586,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524586
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6260504201680672,
"acc_stderr": 0.031429466378837076,
"acc_norm": 0.6260504201680672,
"acc_norm_stderr": 0.031429466378837076
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8073394495412844,
"acc_stderr": 0.016909276884936073,
"acc_norm": 0.8073394495412844,
"acc_norm_stderr": 0.016909276884936073
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621112,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621112
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.031602951437766785,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.031602951437766785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764377,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764377
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384493,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384493
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077802,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077802
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.776500638569604,
"acc_stderr": 0.01489723522945071,
"acc_norm": 0.776500638569604,
"acc_norm_stderr": 0.01489723522945071
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.025070713719153186,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.025070713719153186
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3318435754189944,
"acc_stderr": 0.015748421208187306,
"acc_norm": 0.3318435754189944,
"acc_norm_stderr": 0.015748421208187306
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.027121956071388852,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.027121956071388852
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6816720257234726,
"acc_stderr": 0.026457225067811025,
"acc_norm": 0.6816720257234726,
"acc_norm_stderr": 0.026457225067811025
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7067901234567902,
"acc_stderr": 0.025329888171900933,
"acc_norm": 0.7067901234567902,
"acc_norm_stderr": 0.025329888171900933
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4380704041720991,
"acc_stderr": 0.012671902782567657,
"acc_norm": 0.4380704041720991,
"acc_norm_stderr": 0.012671902782567657
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5772058823529411,
"acc_stderr": 0.030008562845003476,
"acc_norm": 0.5772058823529411,
"acc_norm_stderr": 0.030008562845003476
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.01969145905235402,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.01969145905235402
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.03152439186555402,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.03152439186555402
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37821297429620565,
"mc1_stderr": 0.01697633590754687,
"mc2": 0.5358987502340753,
"mc2_stderr": 0.01565060464007792
},
"harness|winogrande|5": {
"acc": 0.7742699289660616,
"acc_stderr": 0.011749626260902545
},
"harness|gsm8k|5": {
"acc": 0.38286580742987114,
"acc_stderr": 0.013389223491820465
}
}
```
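For a quick sanity check, the per-task accuracies above can be summarized without re-downloading anything; a minimal sketch (the values are copied verbatim from the run above, and the `best_task` helper is illustrative, not part of this repo):

```python
# Summarize a few of the per-task accuracies reported in the results JSON above.
# The numbers are copied verbatim from the latest run.
results = {
    "harness|arc:challenge|25": 0.5750853242320819,
    "harness|hellaswag|10": 0.6100378410675165,
    "harness|winogrande|5": 0.7742699289660616,
    "harness|gsm8k|5": 0.38286580742987114,
}

def best_task(acc_by_task):
    """Return the (task, accuracy) pair with the highest accuracy."""
    return max(acc_by_task.items(), key=lambda kv: kv[1])

task, acc = best_task(results)
print(task, round(acc, 4))  # winogrande is the strongest of these four tasks
```

The same pattern works on the full dictionary loaded from the "results" configuration.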
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
distilled-from-one-sec-cv12/chunk_28 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 867266944
num_examples: 168992
download_size: 886869889
dataset_size: 867266944
---
# Dataset Card for "chunk_28"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pythainlp/UD_Thai-PUD-prompt | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: test
num_bytes: 475196
num_examples: 1000
download_size: 171576
dataset_size: 475196
license: cc-by-sa-3.0
task_categories:
- text2text-generation
- text-generation
language:
- th
size_categories:
- n<1K
---
# Dataset Card for "UD_Thai-PUD-prompt"
This dataset is the test set from the Parallel Universal Dependencies (PUD) treebanks.
See more at [https://github.com/UniversalDependencies/UD_Thai-PUD](https://github.com/UniversalDependencies/UD_Thai-PUD)
## Template
```
Inputs: จงสร้างประโยคตามโครงสร้าง {pos}:
Targets: Thai sentence
```
pos: any of the [universal POS tags](https://universaldependencies.org/u/pos/). (The Thai instruction in `inputs` means "Create a sentence following the structure {pos}:".)
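The prompt construction above is plain string formatting; a minimal sketch (the `build_prompt` helper and the example POS sequence are illustrative, not part of the dataset's tooling):

```python
# Build an "inputs" prompt in the template format shown above.
# The Thai instruction means "Create a sentence following the structure {pos}:".
TEMPLATE = "จงสร้างประโยคตามโครงสร้าง {pos}:"

def build_prompt(pos_tags):
    """Format a sequence of universal POS tags into the dataset's prompt template."""
    return TEMPLATE.format(pos=" ".join(pos_tags))

prompt = build_prompt(["NOUN", "VERB", "NOUN"])
print(prompt)
```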
Source code used to create the dataset: [https://github.com/PyThaiNLP/support-aya-datasets/blob/main/pos/ud_pud_thai.ipynb](https://github.com/PyThaiNLP/support-aya-datasets/blob/main/pos/ud_pud_thai.ipynb)
|
alphalm/gt1_8kElo_all_tokenized-v1 | ---
license: mit
---
Revisions to `alphalm/gt1_8kElo_all_tokenized`:
- model_max_length: reduced from 8192 (v0) to 4096 (v1)
- Only add eos_token to checkmate games |
CyberHarem/hamakaze_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of hamakaze/浜風 (Kantai Collection)
This is the dataset of hamakaze/浜風 (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `short_hair, blue_eyes, grey_hair, hair_ornament, hairclip, hair_over_one_eye, breasts, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 507.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hamakaze_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 324.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hamakaze_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1189 | 681.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hamakaze_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 464.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hamakaze_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1189 | 900.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hamakaze_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hamakaze_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, looking_at_viewer, serafuku, simple_background, solo, upper_body, white_background, sailor_collar, yellow_neckerchief, blush, short_sleeves, smile |
| 1 | 11 |  |  |  |  |  | 1girl, looking_at_viewer, serafuku, short_sleeves, solo, yellow_neckerchief, blush, upper_body, white_gloves, smile, simple_background, collarbone |
| 2 | 14 |  |  |  |  |  | 1girl, black_pantyhose, pleated_skirt, serafuku, short_sleeves, solo, simple_background, white_background, yellow_neckerchief, white_gloves, looking_at_viewer, grey_skirt, blush, eyes_visible_through_hair, blue_sailor_collar, sitting |
| 3 | 10 |  |  |  |  |  | 1girl, serafuku, short_sleeves, solo, white_background, white_gloves, black_pantyhose, pleated_skirt, simple_background, looking_at_viewer |
| 4 | 11 |  |  |  |  |  | 1girl, black_pantyhose, pleated_skirt, serafuku, short_sleeves, solo, white_gloves, looking_at_viewer, neckerchief, smile, blush |
| 5 | 5 |  |  |  |  |  | 1girl, black_pantyhose, looking_at_viewer, panties_under_pantyhose, serafuku, short_sleeves, solo, white_gloves, pleated_skirt, blush |
| 6 | 7 |  |  |  |  |  | 1girl, alternate_costume, blush, looking_at_viewer, simple_background, solo, ribbed_sweater, turtleneck, white_background, long_sleeves, black_pantyhose, dress, open_mouth |
| 7 | 10 |  |  |  |  |  | 1girl, cleavage, looking_at_viewer, solo, blush, simple_background, collarbone, white_background, navel, eyes_visible_through_hair, halterneck, side-tie_bikini_bottom, white_bikini |
| 8 | 10 |  |  |  |  |  | 1girl, day, outdoors, solo, cleavage, cloud, ocean, looking_at_viewer, beach, collarbone, navel, side-tie_bikini_bottom, blue_sky, smile, blush, cowboy_shot, sitting, tree, water, white_bikini |
| 9 | 26 |  |  |  |  |  | 1girl, solo, yukata, looking_at_viewer, hair_flower, obi, food, blush, squid |
| 10 | 8 |  |  |  |  |  | 1girl, cat_ears, solo, blush, cat_cutout, cat_lingerie, cleavage_cutout, looking_at_viewer, underwear_only, black_bra, black_panties, cat_ear_panties, navel, side-tie_panties, simple_background, white_background, cat_tail, choker, collarbone, jingle_bell, neck_bell, eyes_visible_through_hair, fake_animal_ears, kemonomimi_mode, wavy_mouth, white_hair |
| 11 | 5 |  |  |  |  |  | 1girl, blush, cleavage, looking_at_viewer, maid_headdress, solo, enmaided, frills, simple_background, eyes_visible_through_hair, upper_body, white_background, cosplay, detached_sleeves, hair_ribbon, hand_on_own_chest, maid_apron, roswaal_mansion_maid_uniform, x_hair_ornament |
| 12 | 6 |  |  |  |  |  | 1girl, fake_animal_ears, looking_at_viewer, playboy_bunny, rabbit_ears, solo, black_leotard, black_pantyhose, bowtie, cleavage, detached_collar, simple_background, wrist_cuffs, bare_shoulders, blush, white_background, rabbit_tail |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | serafuku | simple_background | solo | upper_body | white_background | sailor_collar | yellow_neckerchief | blush | short_sleeves | smile | white_gloves | collarbone | black_pantyhose | pleated_skirt | grey_skirt | eyes_visible_through_hair | blue_sailor_collar | sitting | neckerchief | panties_under_pantyhose | alternate_costume | ribbed_sweater | turtleneck | long_sleeves | dress | open_mouth | cleavage | navel | halterneck | side-tie_bikini_bottom | white_bikini | day | outdoors | cloud | ocean | beach | blue_sky | cowboy_shot | tree | water | yukata | hair_flower | obi | food | squid | cat_ears | cat_cutout | cat_lingerie | cleavage_cutout | underwear_only | black_bra | black_panties | cat_ear_panties | side-tie_panties | cat_tail | choker | jingle_bell | neck_bell | fake_animal_ears | kemonomimi_mode | wavy_mouth | white_hair | maid_headdress | enmaided | frills | cosplay | detached_sleeves | hair_ribbon | hand_on_own_chest | maid_apron | roswaal_mansion_maid_uniform | x_hair_ornament | playboy_bunny | rabbit_ears | black_leotard | bowtie | detached_collar | wrist_cuffs | bare_shoulders | rabbit_tail |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------------------|:-----------|:--------------------|:-------|:-------------|:-------------------|:----------------|:---------------------|:--------|:----------------|:--------|:---------------|:-------------|:------------------|:----------------|:-------------|:----------------------------|:---------------------|:----------|:--------------|:--------------------------|:--------------------|:-----------------|:-------------|:---------------|:--------|:-------------|:-----------|:--------|:-------------|:-------------------------|:---------------|:------|:-----------|:--------|:--------|:--------|:-----------|:--------------|:-------|:--------|:---------|:--------------|:------|:-------|:--------|:-----------|:-------------|:---------------|:------------------|:-----------------|:------------|:----------------|:------------------|:-------------------|:-----------|:---------|:--------------|:------------|:-------------------|:------------------|:-------------|:-------------|:-----------------|:-----------|:---------|:----------|:-------------------|:--------------|:--------------------|:-------------|:-------------------------------|:------------------|:----------------|:--------------|:----------------|:---------|:------------------|:--------------|:-----------------|:--------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | X | X | X | X | X | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 14 |  |  |  |  |  | X | X | X | X | X | | X | | X | X | X | | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 10 |  |  |  |  |  | X | X | X | X | X | | X | | | | X | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 11 |  |  |  |  |  | X | X | X | | X | | | | | X | X | X | X | | X | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | X | | X | | | | | X | X | | X | | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 7 |  |  |  |  |  | X | X | | X | X | | X | | | X | | | | | X | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 10 |  |  |  |  |  | X | X | | X | X | | X | | | X | | | | X | | | | X | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 10 |  |  |  |  |  | X | X | | | X | | | | | X | | X | | X | | | | | | X | | | | | | | | | X | X | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 26 |  |  |  |  |  | X | X | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 10 | 8 |  |  |  |  |  | X | X | | X | X | | X | | | X | | | | X | | | | X | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 11 | 5 |  |  |  |  |  | X | X | | X | X | X | X | | | X | | | | | | | | X | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 12 | 6 |  |  |  |  |  | X | X | | X | X | | X | | | X | | | | | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X |
|
cqlizhijun/ks | ---
license: mit
---
|
Fiacre/test-animal-poses-controlnet-dataset | ---
dataset_info:
features:
- name: original_image
dtype: image
- name: conditioning_image
dtype: image
- name: overlaid
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 1730245.0
num_examples: 21
download_size: 0
dataset_size: 1730245.0
---
# Dataset Card for "test-animal-poses-controlnet-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |