| datasetId | card |
|---|---|
4naluvs/BEOMGYUv4 | ---
license: openrail
---
|
tr416/dataset_20231007_033301 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 762696.0
num_examples: 297
- name: test
num_bytes: 7704.0
num_examples: 3
download_size: 74254
dataset_size: 770400.0
---
# Dataset Card for "dataset_20231007_033301"
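The `dataset_info` above declares exact byte counts for both splits. A quick sanity check in plain Python (no downloads; only the numbers stated in the card) confirms the counts are mutually consistent and, assuming fixed-length sequences, hints at the tokenized layout:

```python
# Size fields copied from the card's dataset_info.
train_bytes, train_rows = 762_696, 297
test_bytes, test_rows = 7_704, 3
dataset_size = 770_400

# Split sizes should sum to the declared dataset_size.
assert train_bytes + test_bytes == dataset_size

# Both splits have the same exact bytes-per-row, which suggests
# fixed-length sequences rather than variable-length ones.
per_row = train_bytes // train_rows
assert train_bytes % train_rows == 0
assert test_bytes // test_rows == per_row

# Each token costs roughly 4 bytes (int32 input_ids) plus 1 byte
# (int8 attention_mask), so per_row / 5 approximates the sequence length.
approx_seq_len = per_row / 5
print(per_row, round(approx_seq_len))  # 2568 514
```

The ~514-token estimate is only approximate, since per-row storage overhead is not accounted for; sequences of 512 tokens would be consistent with these numbers.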
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Delius/first | ---
license: apache-2.0
language:
- zh
size_categories:
- 10K<n<100K
---
This dataset targets the text-expansion task: expanding a short Chinese text, such as an entertainment headline, into a full news article (the instruction asks the model to rewrite the given text as a news report).
Example instance:
{
"Instruction": "请将给你的文本内容扩写成新闻",
"Input": "杨幂获杂志主编夸奖:大幂幂瘦了越来越漂亮",
"Response": "网易娱乐4月29日报道 4月29日,时尚主编张宇晒出一组与杨幂的自拍合影,并发文:“今天与杨幂做同事,大幂幂瘦了,越来越漂亮。”照片中,杨幂托腮卖萌,瘦了不少,少女味十足。网友们也被杨幂的造型征服,纷纷表示:“太美了,看上去年轻了10岁。”"
} |
chiennv/mini-ultrachat | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 95562075.8
num_examples: 10000
download_size: 39734710
dataset_size: 95562075.8
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
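The `dataset_info` above declares each row as an `id` string, a `conversations` list of `{content, role}` turns, and a `source` string. A minimal sketch of a record matching that schema, with entirely hypothetical values:

```python
# A hypothetical record shaped like the declared features
# (id: string, conversations: list of {content, role}, source: string).
record = {
    "id": "example-0",
    "conversations": [
        {"role": "user", "content": "Hi!"},
        {"role": "assistant", "content": "Hello, how can I help?"},
    ],
    "source": "ultrachat",
}

# Minimal schema check mirroring the declared feature types.
assert isinstance(record["id"], str)
assert all(
    isinstance(turn["content"], str) and isinstance(turn["role"], str)
    for turn in record["conversations"]
)
assert isinstance(record["source"], str)
```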
|
vvtq/toy | ---
dataset_info:
features:
- name: image
dtype: image
- name: noised
dtype: image
- name: image_caption
dtype: string
splits:
- name: train
num_bytes: 6839293.0
num_examples: 6
download_size: 5138950
dataset_size: 6839293.0
---
# Dataset Card for "toy"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_nbeerbower__bruphin-epsilon | ---
pretty_name: Evaluation run of nbeerbower/bruphin-epsilon
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nbeerbower/bruphin-epsilon](https://huggingface.co/nbeerbower/bruphin-epsilon)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nbeerbower__bruphin-epsilon\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-25T05:49:20.264803](https://huggingface.co/datasets/open-llm-leaderboard/details_nbeerbower__bruphin-epsilon/blob/main/results_2024-01-25T05-49-20.264803.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6562434752388866,\n\
\ \"acc_stderr\": 0.03198900028362337,\n \"acc_norm\": 0.6555271311464355,\n\
\ \"acc_norm_stderr\": 0.0326584820786784,\n \"mc1\": 0.5275397796817626,\n\
\ \"mc1_stderr\": 0.017476930190712187,\n \"mc2\": 0.669482738361527,\n\
\ \"mc2_stderr\": 0.01527115945822096\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6996587030716723,\n \"acc_stderr\": 0.013395909309957004,\n\
\ \"acc_norm\": 0.7209897610921502,\n \"acc_norm_stderr\": 0.013106784883601327\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7137024497112129,\n\
\ \"acc_stderr\": 0.0045110633512787015,\n \"acc_norm\": 0.8809002190798646,\n\
\ \"acc_norm_stderr\": 0.003232439139881554\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.040943762699967926,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.040943762699967926\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544064,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544064\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146267,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146267\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083522,\n \"\
acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083522\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n\
\ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.02983796238829194,\n \
\ \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.02983796238829194\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.015555802713590172,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.015555802713590172\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n\
\ \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159464,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159464\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n\
\ \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n\
\ \"acc_norm_stderr\": 0.01354741565866226\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7543352601156069,\n \"acc_stderr\": 0.023176298203992002,\n\
\ \"acc_norm\": 0.7543352601156069,\n \"acc_norm_stderr\": 0.023176298203992002\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42905027932960893,\n\
\ \"acc_stderr\": 0.016553287863116037,\n \"acc_norm\": 0.42905027932960893,\n\
\ \"acc_norm_stderr\": 0.016553287863116037\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.023788583551658533,\n\
\ \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.023788583551658533\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n\
\ \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n\
\ \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n\
\ \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806315,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806315\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.0282638899437846,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.0282638899437846\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5275397796817626,\n\
\ \"mc1_stderr\": 0.017476930190712187,\n \"mc2\": 0.669482738361527,\n\
\ \"mc2_stderr\": 0.01527115945822096\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8382004735595896,\n \"acc_stderr\": 0.010350128010292404\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7050796057619408,\n \
\ \"acc_stderr\": 0.012560698010954772\n }\n}\n```"
repo_url: https://huggingface.co/nbeerbower/bruphin-epsilon
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|arc:challenge|25_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|gsm8k|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hellaswag|10_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T05-49-20.264803.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-25T05-49-20.264803.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- '**/details_harness|winogrande|5_2024-01-25T05-49-20.264803.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-25T05-49-20.264803.parquet'
- config_name: results
data_files:
- split: 2024_01_25T05_49_20.264803
path:
- results_2024-01-25T05-49-20.264803.parquet
- split: latest
path:
- results_2024-01-25T05-49-20.264803.parquet
---
# Dataset Card for Evaluation run of nbeerbower/bruphin-epsilon
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nbeerbower/bruphin-epsilon](https://huggingface.co/nbeerbower/bruphin-epsilon) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nbeerbower__bruphin-epsilon",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-25T05:49:20.264803](https://huggingface.co/datasets/open-llm-leaderboard/details_nbeerbower__bruphin-epsilon/blob/main/results_2024-01-25T05-49-20.264803.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6562434752388866,
"acc_stderr": 0.03198900028362337,
"acc_norm": 0.6555271311464355,
"acc_norm_stderr": 0.0326584820786784,
"mc1": 0.5275397796817626,
"mc1_stderr": 0.017476930190712187,
"mc2": 0.669482738361527,
"mc2_stderr": 0.01527115945822096
},
"harness|arc:challenge|25": {
"acc": 0.6996587030716723,
"acc_stderr": 0.013395909309957004,
"acc_norm": 0.7209897610921502,
"acc_norm_stderr": 0.013106784883601327
},
"harness|hellaswag|10": {
"acc": 0.7137024497112129,
"acc_stderr": 0.0045110633512787015,
"acc_norm": 0.8809002190798646,
"acc_norm_stderr": 0.003232439139881554
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.040943762699967926,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.040943762699967926
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544064,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544064
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.02548718714785938,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.02548718714785938
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083522,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083522
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.02983796238829194,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.02983796238829194
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.015555802713590172,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.015555802713590172
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159464,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159464
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7543352601156069,
"acc_stderr": 0.023176298203992002,
"acc_norm": 0.7543352601156069,
"acc_norm_stderr": 0.023176298203992002
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42905027932960893,
"acc_stderr": 0.016553287863116037,
"acc_norm": 0.42905027932960893,
"acc_norm_stderr": 0.016553287863116037
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.023788583551658533,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.023788583551658533
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806315,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806315
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.0282638899437846,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.0282638899437846
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5275397796817626,
"mc1_stderr": 0.017476930190712187,
"mc2": 0.669482738361527,
"mc2_stderr": 0.01527115945822096
},
"harness|winogrande|5": {
"acc": 0.8382004735595896,
"acc_stderr": 0.010350128010292404
},
"harness|gsm8k|5": {
"acc": 0.7050796057619408,
"acc_stderr": 0.012560698010954772
}
}
```
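Once loaded, the aggregated JSON above is just a nested dict keyed by `harness|<task>|<n_shot>`. As a minimal sketch (the helper names here are illustrative, not part of any leaderboard tooling), per-benchmark averages such as the MMLU score can be recomputed from it like this:

```python
# Sketch: recompute the MMLU average from a results dict shaped like the
# JSON above. Only a small illustrative subset of tasks is included here.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.6592592592592592},
    "harness|winogrande|5": {"acc": 0.8382004735595896},  # not an MMLU task
}

# MMLU tasks are the "hendrycksTest-*" entries; the leaderboard averages
# their acc_norm values.
mmlu_scores = [
    v["acc_norm"]
    for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_scores) / len(mmlu_scores)
print(f"MMLU average acc_norm over {len(mmlu_scores)} tasks: {mmlu_avg:.4f}")
```

The same pattern works against the full 57-task dict from the "results" config.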
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_79 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1293566588.0
num_examples: 254039
download_size: 1322009453
dataset_size: 1293566588.0
---
# Dataset Card for "chunk_79"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ohicarip/deepfashion_bl2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 4518429847.744
num_examples: 34032
download_size: 5304374988
dataset_size: 4518429847.744
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "deepfashion_bl2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_lmsys__vicuna-13b-v1.1 | ---
pretty_name: Evaluation run of lmsys/vicuna-13b-v1.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lmsys/vicuna-13b-v1.1](https://huggingface.co/lmsys/vicuna-13b-v1.1) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lmsys__vicuna-13b-v1.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-16T09:09:49.643618](https://huggingface.co/datasets/open-llm-leaderboard/details_lmsys__vicuna-13b-v1.1/blob/main/results_2023-10-16T09-09-49.643618.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.029677013422818792,\n\
\ \"em_stderr\": 0.0017378324714143493,\n \"f1\": 0.09310612416107406,\n\
\ \"f1_stderr\": 0.002167792401176146,\n \"acc\": 0.4141695683211732,\n\
\ \"acc_stderr\": 0.010019161585538096\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.029677013422818792,\n \"em_stderr\": 0.0017378324714143493,\n\
\ \"f1\": 0.09310612416107406,\n \"f1_stderr\": 0.002167792401176146\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08642911296436695,\n \
\ \"acc_stderr\": 0.00774004433710381\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7419100236779794,\n \"acc_stderr\": 0.012298278833972384\n\
\ }\n}\n```"
repo_url: https://huggingface.co/lmsys/vicuna-13b-v1.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|arc:challenge|25_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_16T09_09_49.643618
path:
- '**/details_harness|drop|3_2023-10-16T09-09-49.643618.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-16T09-09-49.643618.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_16T09_09_49.643618
path:
- '**/details_harness|gsm8k|5_2023-10-16T09-09-49.643618.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-16T09-09-49.643618.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hellaswag|10_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T14:11:02.419209.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T14:11:02.419209.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-24T14:11:02.419209.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_16T09_09_49.643618
path:
- '**/details_harness|winogrande|5_2023-10-16T09-09-49.643618.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-16T09-09-49.643618.parquet'
- config_name: results
data_files:
- split: 2023_07_24T14_11_02.419209
path:
- results_2023-07-24T14:11:02.419209.parquet
- split: 2023_10_16T09_09_49.643618
path:
- results_2023-10-16T09-09-49.643618.parquet
- split: latest
path:
- results_2023-10-16T09-09-49.643618.parquet
---
# Dataset Card for Evaluation run of lmsys/vicuna-13b-v1.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lmsys/vicuna-13b-v1.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lmsys/vicuna-13b-v1.1](https://huggingface.co/lmsys/vicuna-13b-v1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lmsys__vicuna-13b-v1.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-16T09:09:49.643618](https://huggingface.co/datasets/open-llm-leaderboard/details_lmsys__vicuna-13b-v1.1/blob/main/results_2023-10-16T09-09-49.643618.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.029677013422818792,
"em_stderr": 0.0017378324714143493,
"f1": 0.09310612416107406,
"f1_stderr": 0.002167792401176146,
"acc": 0.4141695683211732,
"acc_stderr": 0.010019161585538096
},
"harness|drop|3": {
"em": 0.029677013422818792,
"em_stderr": 0.0017378324714143493,
"f1": 0.09310612416107406,
"f1_stderr": 0.002167792401176146
},
"harness|gsm8k|5": {
"acc": 0.08642911296436695,
"acc_stderr": 0.00774004433710381
},
"harness|winogrande|5": {
"acc": 0.7419100236779794,
"acc_stderr": 0.012298278833972384
}
}
```
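For this run, the aggregate `acc` in the `all` block is the unweighted mean of the per-task accuracies; a minimal sketch with the values copied from the JSON above (the averaging rule is inferred from these numbers, not taken from leaderboard documentation):

```python
# Per-task accuracies copied from the latest results JSON above.
task_acc = {
    "harness|gsm8k|5": 0.08642911296436695,
    "harness|winogrande|5": 0.7419100236779794,
}

# The "all" accuracy is the simple (unweighted) mean over the tasks
# that report an accuracy.
mean_acc = sum(task_acc.values()) / len(task_acc)
print(mean_acc)  # ~0.4141695683211732, the "acc" in the "all" block
```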
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
gglab-ku/turkish-plu-step-inference | ---
license: apache-2.0
---
|
skrishna/coin_flip_15_transformed | ---
dataset_info:
features:
- name: targets
dtype: string
- name: targets_vec
sequence: int64
- name: inputs
dtype: string
- name: text
dtype: string
- name: label
dtype: string
splits:
- name: test
num_bytes: 2021982
num_examples: 2000
- name: train
num_bytes: 2018958
num_examples: 2000
download_size: 1151656
dataset_size: 4040940
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
---
|
ajinkyakolhe112/pizza_vs_steak_classification | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': pizza
'1': steak
splits:
- name: train
num_bytes: 84855621.0
num_examples: 1500
- name: test
num_bytes: 28474930.0
num_examples: 500
download_size: 110558749
dataset_size: 113330551.0
---
# Dataset Card for "pizza_vs_steak_classification"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
boohboohdog/test | ---
license: mit
---
|
Plona/Chaoyang_FactVer1.3_v5 | ---
configs:
- config_name: default
data_files:
- split: train
path: "Claims_Covid_Train.json"
- split: test
path: "Claims_Covid_Test.json"
--- |
CyberHarem/galil_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of galil/ガリル/加利尔 (Girls' Frontline)
This is the dataset of galil/ガリル/加利尔 (Girls' Frontline), containing 10 images and their tags.
The core tags of this character are `long_hair, ahoge, brown_hair, brown_eyes, blonde_hair, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 10 | 9.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/galil_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 10 | 6.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/galil_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 23 | 12.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/galil_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 10 | 8.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/galil_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 23 | 15.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/galil_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/galil_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, solo, looking_at_viewer, simple_background, skirt, white_background, assault_rifle, holding_weapon, jacket, military_uniform, necklace, pantyhose, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | simple_background | skirt | white_background | assault_rifle | holding_weapon | jacket | military_uniform | necklace | pantyhose | smile |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------------------|:--------|:-------------------|:----------------|:-----------------|:---------|:-------------------|:-----------|:------------|:--------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
michelecafagna26/hl-narratives | ---
license: apache-2.0
task_categories:
- image-to-text
- question-answering
- zero-shot-classification
language:
- en
multilinguality:
- monolingual
task_ids:
- text-scoring
pretty_name: HL-Narratives (High-Level Narratives Dataset)
size_categories:
- 10K<n<100K
annotations_creators:
- machine-generated
dataset_info:
splits:
- name: train
num_examples: 13498
- name: test
num_examples: 1499
---
# Dataset Card for the High-Level Narratives Dataset
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Supported Tasks](#supported-tasks)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
The High-Level Narratives (HL-Narratives) dataset aligns **object-centric descriptions** from [COCO](https://arxiv.org/pdf/1405.0312.pdf)
with synthetic **high-level narrative captions** automatically generated by merging **_scene_, _action_, _rationale_** captions from the [HL Dataset](https://huggingface.co/datasets/michelecafagna26/hl) using [T5](https://huggingface.co/Vamsi/T5_Paraphrase_Paws).
The HL-Narratives dataset contains 14997 images from COCO and a total of 134973 synthetic captions (3 captions per image) aligned with ~749984 object-centric captions from COCO.
**The high-level descriptions capture the human interpretations of the images**. These interpretations contain abstract concepts not directly linked to physical objects.
Each high-level description is provided with a _confidence score_, crowdsourced by an independent worker measuring the extent to which
the high-level description is likely given the corresponding image, question, and caption. The higher the score, the closer the high-level caption is to common sense (on a Likert scale from 1-5).
- **🗃️ Repository:** [github.com/michelecafagna26/HL-dataset](https://github.com/michelecafagna26/HL-dataset)
- **📜 Paper:** [HL Dataset: Visually-grounded Description of Scenes, Actions and Rationales](https://arxiv.org/abs/2302.12189?context=cs.CL)
[//]: # (- **🧭 Spaces:** [Dataset explorer](https://huggingface.co/spaces/michelecafagna26/High-Level-Dataset-explorer))
- **🖊️ Contact:** michele.cafagna@um.edu.mt
### Supported Tasks
- image captioning
- multimodal text-scoring
- zero-shot evaluation
### Languages
English
## Dataset Structure
The dataset is provided with images from COCO and two metadata jsonl files containing the annotations.
### Data Instances
An instance looks like this:
```json
{
"file_name": "COCO_train2014_000000000036.jpg",
"captions": ["In a beach, holding an umbrella means they won't get a sunburn.",
"The lady is posing with the sun umbrella, which was taken on the beach and is enjoying and getting pictures of her vacation.",
"She is holding a parasol that is taken by a lake she is vacationing and is sunny."]
}
```
### Data Fields
- ```file_name```: original COCO filename
- ```captions```: List[str] containing 3 narrative captions for the image.
### Data Splits
There are 14997 images and 134973 high-level captions split into:
- Train-val: 13498 images and 121482 high-level captions
- Test: 1499 images and 13491 high-level captions
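Each line of the metadata jsonl files is one JSON object with the fields above; a minimal parsing sketch using only the standard library (the record here is copied from the example instance, and the jsonl file layout is assumed from the description above):

```python
import json

# One annotation record, copied from the example instance above.
line = json.dumps({
    "file_name": "COCO_train2014_000000000036.jpg",
    "captions": [
        "In a beach, holding an umbrella means they won't get a sunburn.",
        "The lady is posing with the sun umbrella, which was taken on the beach and is enjoying and getting pictures of her vacation.",
        "She is holding a parasol that is taken by a lake she is vacationing and is sunny.",
    ],
})

record = json.loads(line)          # each jsonl line parses to one dict
image_file = record["file_name"]   # original COCO filename
narratives = record["captions"]    # 3 narrative captions for the image
print(image_file, len(narratives))
```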
## Dataset Creation
The dataset has been automatically generated using T5 to merge the HL captions axis-wise.
From the paper:
> We frame the synthesis of narrative captions as a paraphrasing task. We follow a human-in-the-loop approach consisting of three stages:
> (i) we manually annotate a small sample of gold data;
> (ii) we fine-tune a large pre-trained language model (LPLM);
> (iii) we use the fine-tuned model to generate a sample of data, which is manually corrected and then
> (iv) added to the gold annotations before fine-tuning again.
### Curation Rationale
From the paper:
>We now describe how we extend the dataset to combine the three axes to compose a short `narrative', which describes the scene, action and rationale in tandem.
> To do this, we leverage the individual axes and synthesise this part of the data using a pre-trained language model.
> Since scenes, actions, and rationales were elicited individually in a visually grounded and controlled setting,
>a synthesised version of the three individual captions should also be true of the image to the same extent (modulo the variations in confidence that we observe).
### Source Data
- Images: COCO
- captions annotations: automatically generated
#### Annotation process
From the paper:
> We use a version of T5 already fine-tuned on paraphrase generation as LPLM data generator.
> We initialise the process with manually paraphrased annotations for 50 images ($3 \times 50 = 150$), fine-tune the model for 2 epochs,
> and generate 150 captions for another 50 images, which are manually corrected and added to the original 150.
> The model is then fine-tuned for a further two epochs. In each iteration, we reserve $10\%$ as validation data.
> After two epochs, we observe that the validation loss does not improve further.
> Finally, in the last iteration, we use all gold data to fine-tune the model and generate synthetic high-level captions for the whole HL dataset,
> obtaining 14,997 synthetic captions for training and 1499 for testing. In addition to the T5 paraphrase model,
> we also experimented with LLaMA in a few-shot setting; however, we find that T5 outperforms LLAMA in this task.
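As a rough sketch of how the three axis captions could be merged into a single paraphrasing input for a T5 paraphrase model such as [Vamsi/T5_Paraphrase_Paws](https://huggingface.co/Vamsi/T5_Paraphrase_Paws): the `paraphrase:` prefix follows that model card's convention, but the authors' exact prompt is an assumption here, and the three captions below are made-up examples.

```python
# Hypothetical input construction: concatenate the scene, action and
# rationale captions into one paraphrase prompt for the T5 paraphraser.
def build_paraphrase_input(scene: str, action: str, rationale: str) -> str:
    merged = f"{scene} {action} {rationale}"
    return f"paraphrase: {merged} </s>"

prompt = build_paraphrase_input(
    "the picture is taken on a beach",
    "she is holding a sun umbrella",
    "she does not want to get a sunburn",
)
print(prompt)
```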
### Personal and Sensitive Information
There is no personal or sensitive information
## Considerations for Using the Data
[More Information Needed]
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
### Dataset Curators
Michele Cafagna
### Licensing Information
The Images follow the [COCO terms of Use](https://cocodataset.org/#termsofuse)
The remaining annotations are licensed under Apache-2.0 license.
### Citation Information
```BibTeX
@inproceedings{cafagna2023hl,
title={{HL} {D}ataset: {V}isually-grounded {D}escription of {S}cenes, {A}ctions and
{R}ationales},
author={Cafagna, Michele and van Deemter, Kees and Gatt, Albert},
booktitle={Proceedings of the 16th International Natural Language Generation Conference (INLG'23)},
address = {Prague, Czech Republic},
year={2023}
}
```
|
Kira-Floris/gov-report-qs-llama2-format | ---
license: mit
task_categories:
- question-answering
- text2text-generation
language:
- en
size_categories:
- 10K<n<100K
---
### Government Report Question Answering Dataset in LLAMA2 Format
#### Dataset Description
This dataset is a LLAMA2-formatted version of the [GovReport Dataset](https://gov-report-data.github.io/), which consists of reports written by government research agencies, including the Congressional Research Service and the US Government Accountability Office.
The purpose of this dataset is to give those fine-tuning LLAMA2 and other LLMs for the government domain a pre-formatted, easier-to-use dataset.
- Formatted by: Floris Nzabakira
- Language: English
- License: MIT
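For reference, "LLAMA2 format" usually means the Llama 2 chat prompt template; here is a minimal sketch of wrapping a question-answer pair in it (a generic illustration only: the exact formatting used in this dataset may differ, and the question and answer strings are hypothetical).

```python
# Generic Llama 2 chat template (illustration only; the exact
# formatting used in this dataset may differ).
def to_llama2_prompt(question: str, answer: str, system: str = "") -> str:
    sys_block = f"<<SYS>>\n{system}\n<</SYS>>\n\n" if system else ""
    return f"<s>[INST] {sys_block}{question} [/INST] {answer} </s>"

sample = to_llama2_prompt(
    "What does the report say about broadband funding?",  # hypothetical question
    "It summarizes federal broadband funding programs.",  # hypothetical answer
)
print(sample)
```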
#### Dataset Source
The dataset's original source can be found at [GovReport Dataset github.io](https://gov-report-data.github.io/).
#### Applications
The dataset is mainly used for question answering. It can, however, be used for other applications such as:
- Fine-tuning LLMs for the government domain
- Chatbots
- Text2Text Generation |
charliexu07/license_plates_2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': test
'1': train
splits:
- name: train
num_bytes: 40055469.0
num_examples: 44
download_size: 33773613
dataset_size: 40055469.0
---
# Dataset Card for "license_plates_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
fifi777/text2dv_code | ---
dataset_info:
features:
- name: id
dtype: int64
- name: content
sequence: string
- name: preprocessing_code
sequence: string
- name: visualization_code
sequence: string
- name: model
dtype: string
- name: running_info
struct:
- name: error
dtype: bool
- name: error_msg
dtype: string
- name: library
sequence: string
- name: meet_expectation
dtype: bool
- name: time
dtype: float64
- name: token
struct:
- name: completion_tokens
dtype: int64
- name: prompt_tokens
dtype: int64
- name: total_tokens
dtype: int64
- name: prompt_info
struct:
- name: dataset
dtype: string
- name: plot_type
dtype: string
- name: promote
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: python
num_bytes: 372637
num_examples: 200
download_size: 0
dataset_size: 372637
---
# Dataset Card for "text2dv_code"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
inquisitive_qg | ---
pretty_name: InquisitiveQg
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text2text-generation
task_ids: []
paperswithcode_id: inquisitive
tags:
- question-generation
dataset_info:
features:
- name: id
dtype: int32
- name: article_id
dtype: int32
- name: article
dtype: string
- name: sentence_id
dtype: int32
- name: sentence
dtype: string
- name: span
dtype: string
- name: question
dtype: string
- name: span_start_position
dtype: int32
- name: span_end_position
dtype: int32
config_name: plain_text
splits:
- name: train
num_bytes: 66099232
num_examples: 15931
- name: validation
num_bytes: 8904329
num_examples: 1991
- name: test
num_bytes: 7167203
num_examples: 1894
download_size: 7085941
dataset_size: 82170764
---
# Dataset Card for InquisitiveQg
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Add homepage URL here if available (unless it's a GitHub repository)]()
- **Repository:** [If the dataset is hosted on github or has a github homepage, add URL here]()
- **Paper:** [If the dataset was introduced by a paper or there was a paper written describing the dataset, add URL here (landing page for Arxiv paper preferred)]()
- **Leaderboard:** [If the dataset supports an active leaderboard, add link here]()
- **Point of Contact:** [If known, name and email of at least one person the reader can contact for questions about the dataset.]()
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@patil-suraj](https://github.com/patil-suraj) for adding this dataset. |
justinphan3110/sharegpt_instructions_small | ---
dataset_info:
features:
- name: instructions
dtype: string
splits:
- name: train
num_bytes: 58210
num_examples: 424
download_size: 40903
dataset_size: 58210
---
# Dataset Card for "sharegpt_instructions_small"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
malucoelhaofc/VERSION_HARVEST_SCOTTTENORMAN | ---
license: openrail
---
|
MobeenHameed/khan2 | ---
license: mit
dataset_info:
features:
- name: audio
dtype: audio
- name: transcription
dtype: string
splits:
- name: train
num_bytes: 191533479.0
num_examples: 145
download_size: 181986652
dataset_size: 191533479.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_MisterRid__wendigo-14b-alpha4 | ---
pretty_name: Evaluation run of MisterRid/wendigo-14b-alpha4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MisterRid/wendigo-14b-alpha4](https://huggingface.co/MisterRid/wendigo-14b-alpha4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MisterRid__wendigo-14b-alpha4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-18T06:46:37.615025](https://huggingface.co/datasets/open-llm-leaderboard/details_MisterRid__wendigo-14b-alpha4/blob/main/results_2023-12-18T06-46-37.615025.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5958734764422213,\n\
\ \"acc_stderr\": 0.033567613925099785,\n \"acc_norm\": 0.6017569763815189,\n\
\ \"acc_norm_stderr\": 0.034261570709298174,\n \"mc1\": 0.397796817625459,\n\
\ \"mc1_stderr\": 0.017133934248559638,\n \"mc2\": 0.5497966141695696,\n\
\ \"mc2_stderr\": 0.01557713395489198\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5571672354948806,\n \"acc_stderr\": 0.0145155738733489,\n\
\ \"acc_norm\": 0.5930034129692833,\n \"acc_norm_stderr\": 0.014356399418009121\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.599681338378809,\n\
\ \"acc_stderr\": 0.004889615413144195,\n \"acc_norm\": 0.7964548894642501,\n\
\ \"acc_norm_stderr\": 0.004018115765954247\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544057,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544057\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n\
\ \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.6805555555555556,\n\
\ \"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n\
\ \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n\
\ \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467383,\n\
\ \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467383\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.38095238095238093,\n \"acc_stderr\": 0.025010749116137595,\n \"\
acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.025010749116137595\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7064516129032258,\n\
\ \"acc_stderr\": 0.025906087021319295,\n \"acc_norm\": 0.7064516129032258,\n\
\ \"acc_norm_stderr\": 0.025906087021319295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.43842364532019706,\n \"acc_stderr\": 0.03491207857486518,\n\
\ \"acc_norm\": 0.43842364532019706,\n \"acc_norm_stderr\": 0.03491207857486518\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03588624800091706,\n\
\ \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03588624800091706\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932026,\n \"\
acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932026\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.02717121368316453,\n\
\ \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.02717121368316453\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6333333333333333,\n \"acc_stderr\": 0.024433016466052462,\n\
\ \"acc_norm\": 0.6333333333333333,\n \"acc_norm_stderr\": 0.024433016466052462\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.02849346509102859,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.02849346509102859\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.030489911417673227,\n\
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.030489911417673227\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763744,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763744\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8110091743119267,\n \"acc_stderr\": 0.01678548115920363,\n \"\
acc_norm\": 0.8110091743119267,\n \"acc_norm_stderr\": 0.01678548115920363\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7352941176470589,\n \"acc_stderr\": 0.030964517926923393,\n \"\
acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.030964517926923393\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7046413502109705,\n \"acc_stderr\": 0.029696338713422876,\n \
\ \"acc_norm\": 0.7046413502109705,\n \"acc_norm_stderr\": 0.029696338713422876\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.03219079200419995,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.03219079200419995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467765,\n\
\ \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467765\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.04373313040914761,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.04373313040914761\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.03731133519673893,\n\
\ \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.03731133519673893\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.023902325549560403,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.023902325549560403\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7854406130268199,\n\
\ \"acc_stderr\": 0.014680033956893346,\n \"acc_norm\": 0.7854406130268199,\n\
\ \"acc_norm_stderr\": 0.014680033956893346\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.615606936416185,\n \"acc_stderr\": 0.026189666966272035,\n\
\ \"acc_norm\": 0.615606936416185,\n \"acc_norm_stderr\": 0.026189666966272035\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34972067039106147,\n\
\ \"acc_stderr\": 0.015949308790233645,\n \"acc_norm\": 0.34972067039106147,\n\
\ \"acc_norm_stderr\": 0.015949308790233645\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6568627450980392,\n \"acc_stderr\": 0.02718449890994161,\n\
\ \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.02718449890994161\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6728395061728395,\n \"acc_stderr\": 0.026105673861409818,\n\
\ \"acc_norm\": 0.6728395061728395,\n \"acc_norm_stderr\": 0.026105673861409818\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.43617021276595747,\n \"acc_stderr\": 0.02958345203628407,\n \
\ \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.02958345203628407\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41590612777053454,\n\
\ \"acc_stderr\": 0.012588323850313627,\n \"acc_norm\": 0.41590612777053454,\n\
\ \"acc_norm_stderr\": 0.012588323850313627\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.029029422815681393,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.029029422815681393\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6045751633986928,\n \"acc_stderr\": 0.019780465954777515,\n \
\ \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.019780465954777515\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6571428571428571,\n \"acc_stderr\": 0.030387262919547728,\n\
\ \"acc_norm\": 0.6571428571428571,\n \"acc_norm_stderr\": 0.030387262919547728\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7611940298507462,\n\
\ \"acc_stderr\": 0.03014777593540922,\n \"acc_norm\": 0.7611940298507462,\n\
\ \"acc_norm_stderr\": 0.03014777593540922\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7485380116959064,\n \"acc_stderr\": 0.033275044238468436,\n\
\ \"acc_norm\": 0.7485380116959064,\n \"acc_norm_stderr\": 0.033275044238468436\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.397796817625459,\n\
\ \"mc1_stderr\": 0.017133934248559638,\n \"mc2\": 0.5497966141695696,\n\
\ \"mc2_stderr\": 0.01557713395489198\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7474348855564326,\n \"acc_stderr\": 0.012211148449394105\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3297952994692949,\n \
\ \"acc_stderr\": 0.01294995503057115\n }\n}\n```"
repo_url: https://huggingface.co/nbeerbower/bruphin-epsilon
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|arc:challenge|25_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|gsm8k|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hellaswag|10_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T06-46-37.615025.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-18T06-46-37.615025.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- '**/details_harness|winogrande|5_2023-12-18T06-46-37.615025.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-18T06-46-37.615025.parquet'
- config_name: results
data_files:
- split: 2023_12_18T06_46_37.615025
path:
- results_2023-12-18T06-46-37.615025.parquet
- split: latest
path:
- results_2023-12-18T06-46-37.615025.parquet
---
# Dataset Card for Evaluation run of MisterRid/wendigo-14b-alpha4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [MisterRid/wendigo-14b-alpha4](https://huggingface.co/MisterRid/wendigo-14b-alpha4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MisterRid__wendigo-14b-alpha4",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-18T06:46:37.615025](https://huggingface.co/datasets/open-llm-leaderboard/details_MisterRid__wendigo-14b-alpha4/blob/main/results_2023-12-18T06-46-37.615025.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5958734764422213,
"acc_stderr": 0.033567613925099785,
"acc_norm": 0.6017569763815189,
"acc_norm_stderr": 0.034261570709298174,
"mc1": 0.397796817625459,
"mc1_stderr": 0.017133934248559638,
"mc2": 0.5497966141695696,
"mc2_stderr": 0.01557713395489198
},
"harness|arc:challenge|25": {
"acc": 0.5571672354948806,
"acc_stderr": 0.0145155738733489,
"acc_norm": 0.5930034129692833,
"acc_norm_stderr": 0.014356399418009121
},
"harness|hellaswag|10": {
"acc": 0.599681338378809,
"acc_stderr": 0.004889615413144195,
"acc_norm": 0.7964548894642501,
"acc_norm_stderr": 0.004018115765954247
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544057,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.038990736873573344,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.038990736873573344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467383,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467383
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.025010749116137595,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.025010749116137595
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7064516129032258,
"acc_stderr": 0.025906087021319295,
"acc_norm": 0.7064516129032258,
"acc_norm_stderr": 0.025906087021319295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43842364532019706,
"acc_stderr": 0.03491207857486518,
"acc_norm": 0.43842364532019706,
"acc_norm_stderr": 0.03491207857486518
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03588624800091706,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03588624800091706
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.030532892233932026,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.030532892233932026
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.02717121368316453,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.02717121368316453
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6333333333333333,
"acc_stderr": 0.024433016466052462,
"acc_norm": 0.6333333333333333,
"acc_norm_stderr": 0.024433016466052462
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.02849346509102859,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.02849346509102859
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.030489911417673227,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.030489911417673227
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763744,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763744
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8110091743119267,
"acc_stderr": 0.01678548115920363,
"acc_norm": 0.8110091743119267,
"acc_norm_stderr": 0.01678548115920363
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.030964517926923393,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.030964517926923393
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7046413502109705,
"acc_stderr": 0.029696338713422876,
"acc_norm": 0.7046413502109705,
"acc_norm_stderr": 0.029696338713422876
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.03219079200419995,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.03219079200419995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467765,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467765
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.04373313040914761,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.04373313040914761
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.03731133519673893,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.03731133519673893
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.023902325549560403,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.023902325549560403
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7854406130268199,
"acc_stderr": 0.014680033956893346,
"acc_norm": 0.7854406130268199,
"acc_norm_stderr": 0.014680033956893346
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.615606936416185,
"acc_stderr": 0.026189666966272035,
"acc_norm": 0.615606936416185,
"acc_norm_stderr": 0.026189666966272035
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.34972067039106147,
"acc_stderr": 0.015949308790233645,
"acc_norm": 0.34972067039106147,
"acc_norm_stderr": 0.015949308790233645
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.02718449890994161,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.02718449890994161
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6728395061728395,
"acc_stderr": 0.026105673861409818,
"acc_norm": 0.6728395061728395,
"acc_norm_stderr": 0.026105673861409818
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.02958345203628407,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.02958345203628407
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41590612777053454,
"acc_stderr": 0.012588323850313627,
"acc_norm": 0.41590612777053454,
"acc_norm_stderr": 0.012588323850313627
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.029029422815681393,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.029029422815681393
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.019780465954777515,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.019780465954777515
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6571428571428571,
"acc_stderr": 0.030387262919547728,
"acc_norm": 0.6571428571428571,
"acc_norm_stderr": 0.030387262919547728
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7611940298507462,
"acc_stderr": 0.03014777593540922,
"acc_norm": 0.7611940298507462,
"acc_norm_stderr": 0.03014777593540922
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.038879718495972646,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.038879718495972646
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7485380116959064,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.7485380116959064,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.397796817625459,
"mc1_stderr": 0.017133934248559638,
"mc2": 0.5497966141695696,
"mc2_stderr": 0.01557713395489198
},
"harness|winogrande|5": {
"acc": 0.7474348855564326,
"acc_stderr": 0.012211148449394105
},
"harness|gsm8k|5": {
"acc": 0.3297952994692949,
"acc_stderr": 0.01294995503057115
}
}
```
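Since the results file is plain JSON, individual metrics can be pulled out with the standard library. The per-task `acc_stderr` values also appear to be the usual binomial standard error, sqrt(p·(1−p)/(n−1)). A minimal sketch using the Winogrande numbers above — the sample size n = 1267 is the standard Winogrande validation-set size, stated here as an assumption rather than read from the file:

```python
import json
import math

# A small excerpt of the results payload shown above (values copied verbatim).
results_json = """
{
  "harness|winogrande|5": {
    "acc": 0.7474348855564326,
    "acc_stderr": 0.012211148449394105
  }
}
"""
results = json.loads(results_json)
wino = results["harness|winogrande|5"]

# Recompute the standard error of a binomial proportion with n - 1 in the
# denominator (sample standard deviation / sqrt(n)); n = 1267 is the usual
# Winogrande validation-set size -- an assumption, not a value from the file.
n = 1267
p = wino["acc"]
stderr = math.sqrt(p * (1 - p) / (n - 1))

print(f"reported stderr = {wino['acc_stderr']:.6f}, recomputed = {stderr:.6f}")
```

The recomputed value agrees with the reported one to within about 1e-6 here, so the stderr columns for accuracy-style metrics can be sanity-checked without rerunning the harness.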
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ayang903/maple | ---
license: gpl-3.0
task_categories:
- summarization
- text-classification
language:
- en
tags:
- legal
pretty_name: Maple Bill Summarization and Tagging
size_categories:
- 100M<n<1B
configs:
- config_name: main_data
data_files: "demoapp/all_bills.csv"
---
# MAPLE (Bill Summarization, Tagging, Explanation)
In this project, we generate summaries and category tags for Massachusetts bills for the [MAPLE Platform](https://www.mapletestimony.org/). The goal is to simplify the legal language and content to make it comprehensible to a broader audience (9th-grade comprehension level) by exploring different ML and LLM services.
This repository contains a pipeline that takes bills from the Massachusetts legislature, generates summaries and category tags leveraging the Massachusetts General Law sections, builds a dashboard to display and save the generated texts, and deploys and integrates the results into the MAPLE platform.
## Roadmap of Repository Directories
* [Documentation](https://github.com/vynpt/ml-maple-bill-summarization/tree/dev/Documentation):
```Research.md```: our research on large language models and evaluation methods we planned to use for this project.
```Documentation MAPLE.pdf```: includes a detailed description of how our model operates, for future use and improvement.
* [EDA](https://github.com/vynpt/ml-maple-bill-summarization/tree/dev/EDA): the notebook ```eda.ipynb``` includes our work on scraping bills from the MAPLE Swagger API, creating a dataframe to clean and process the data, and making visualizations to analyze the data and explore characteristics of the dataset.
* [demoapp](https://github.com/vynpt/ml-maple-bill-summarization/tree/dev/demoapp):
```app.py```: contains the code for the LLM service we used and the webapp we made using Streamlit. The webapp allows users to search all bills.
```app2.py```: we test on the top 12 bills from the MAPLE website. We extract information from the [Massachusetts General Law](https://malegislature.gov/Laws/GeneralLaws) to add context to the summaries of these bills.
Other files: helper files to be imported in the above two Python app files.
* [Prompts Engineering](https://github.com/vynpt/ml-maple-bill-summarization/tree/dev/Prompts%20Engineering): ```prompts.md``` stores all prompts that we tested.
* [Tagging](https://github.com/vynpt/ml-maple-bill-summarization/tree/dev/Tagging): contains the list of categories and tags.
* [Deployment](https://github.com/vynpt/ml-maple-bill-summarization/tree/main/Deployment): contains the link of our Streamlit deployed webapp.
## Ethical Implications
The dataset used for this project is fully open source and can be accessed through the Mass General Laws API.
Our team and MAPLE agree on including a disclaimer that this text is AI-generated.
Although we make use of Vectara's open-source transformer model to evaluate hallucination, it is important to have expert and human evaluation to further maintain a trustworthy LLM system.
## Resources and Citation
* https://huggingface.co/docs/transformers/tasks/summarization
* https://huggingface.co/vectara/hallucination_evaluation_model
* https://github.com/vectara/hallucination-leaderboard
* https://www.nocode.ai/llms-undesirable-outputs/
* https://learn.deeplearning.ai/
* https://blog.langchain.dev/espilla-x-langchain-retrieval-augmented-generation-rag-in-llm-powered-question-answering-pipelines/
## Team Members
Vy Nguyen - Email: nptv1207@bu.edu
Andy Yang - Email: ayang903@bu.edu
Gauri Bhandarwar - Email: gaurib3@bu.edu
Weining Mai - Email: weimai@bu.edu |
xwar/2023-11-19_ninox_dataset_single_column | ---
license: apache-2.0
---
|
RuyuanWan/SBIC_Disagreement | ---
annotations_creators:
- crowdsourced
language:
- en
language_creators:
- found
license: []
multilinguality:
- monolingual
pretty_name: RuyuanWan/SBIC_Disagreement
size_categories: []
source_datasets:
- extended|social_bias_frames
tags: []
task_categories:
- text-classification
task_ids: []
---
This dataset is a processed version of the Social Bias Inference Corpus (SBIC) dataset, including text, annotator demographics, and annotation disagreement labels. <br>
Paper: Everyone's Voice Matters: Quantifying Annotation Disagreement Using Demographic Information <br>
Authors: Ruyuan Wan, Jaehyung Kim, Dongyeop Kang <br>
Github repo: https://github.com/minnesotanlp/Quantifying-Annotation-Disagreement <br>
|
autoevaluate/autoeval-eval-futin__guess-vi-d44dbe-2087167151 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- futin/guess
eval_info:
task: text_zero_shot_classification
model: bigscience/bloom-3b
metrics: []
dataset_name: futin/guess
dataset_config: vi
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: bigscience/bloom-3b
* Dataset: futin/guess
* Config: vi
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@futin](https://huggingface.co/futin) for evaluating this model. |
misza222/OwczarekPodhalanski-dog-lr1e-06-max_train_steps800-results | ---
dataset_info:
features:
- name: images
dtype: image
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 5281596.0
num_examples: 12
download_size: 5282716
dataset_size: 5281596.0
---
# Dataset Card for "OwczarekPodhalanski-dog-lr1e-06-max_train_steps800-results"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_macadeliccc__MonarchLake-7B | ---
pretty_name: Evaluation run of macadeliccc/MonarchLake-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [macadeliccc/MonarchLake-7B](https://huggingface.co/macadeliccc/MonarchLake-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_macadeliccc__MonarchLake-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-22T14:34:21.929064](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__MonarchLake-7B/blob/main/results_2024-02-22T14-34-21.929064.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6510325412646052,\n\
\ \"acc_stderr\": 0.03209871099664823,\n \"acc_norm\": 0.6502226106662531,\n\
\ \"acc_norm_stderr\": 0.03277341367781569,\n \"mc1\": 0.6132190942472461,\n\
\ \"mc1_stderr\": 0.017048857010515103,\n \"mc2\": 0.7497415375798714,\n\
\ \"mc2_stderr\": 0.014308422950656522\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7218430034129693,\n \"acc_stderr\": 0.0130944699195388,\n\
\ \"acc_norm\": 0.7414675767918089,\n \"acc_norm_stderr\": 0.012794553754288687\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.724457279426409,\n\
\ \"acc_stderr\": 0.004458742356237875,\n \"acc_norm\": 0.892850029874527,\n\
\ \"acc_norm_stderr\": 0.0030867169185536053\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.04451807959055328,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.04451807959055328\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563973,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948482,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948482\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.40397350993377484,\n \"acc_stderr\": 0.040064856853653415,\n \"\
acc_norm\": 0.40397350993377484,\n \"acc_norm_stderr\": 0.040064856853653415\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553346,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553346\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281365,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281365\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.013586619219903341,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.013586619219903341\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4301675977653631,\n\
\ \"acc_stderr\": 0.01655860163604104,\n \"acc_norm\": 0.4301675977653631,\n\
\ \"acc_norm_stderr\": 0.01655860163604104\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.02555316999182652,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.02555316999182652\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5106382978723404,\n \"acc_stderr\": 0.02982074719142244,\n \
\ \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.02982074719142244\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n\
\ \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n\
\ \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396553,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396553\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6132190942472461,\n\
\ \"mc1_stderr\": 0.017048857010515103,\n \"mc2\": 0.7497415375798714,\n\
\ \"mc2_stderr\": 0.014308422950656522\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8547750591949487,\n \"acc_stderr\": 0.009902153904760826\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6830932524639879,\n \
\ \"acc_stderr\": 0.012815868296721364\n }\n}\n```"
repo_url: https://huggingface.co/nbeerbower/bruphin-epsilon
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|arc:challenge|25_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|gsm8k|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hellaswag|10_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T14-34-21.929064.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-22T14-34-21.929064.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- '**/details_harness|winogrande|5_2024-02-22T14-34-21.929064.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-22T14-34-21.929064.parquet'
- config_name: results
data_files:
- split: 2024_02_22T14_34_21.929064
path:
- results_2024-02-22T14-34-21.929064.parquet
- split: latest
path:
- results_2024-02-22T14-34-21.929064.parquet
---
# Dataset Card for Evaluation run of macadeliccc/MonarchLake-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [macadeliccc/MonarchLake-7B](https://huggingface.co/macadeliccc/MonarchLake-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_macadeliccc__MonarchLake-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-22T14:34:21.929064](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__MonarchLake-7B/blob/main/results_2024-02-22T14-34-21.929064.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6510325412646052,
"acc_stderr": 0.03209871099664823,
"acc_norm": 0.6502226106662531,
"acc_norm_stderr": 0.03277341367781569,
"mc1": 0.6132190942472461,
"mc1_stderr": 0.017048857010515103,
"mc2": 0.7497415375798714,
"mc2_stderr": 0.014308422950656522
},
"harness|arc:challenge|25": {
"acc": 0.7218430034129693,
"acc_stderr": 0.0130944699195388,
"acc_norm": 0.7414675767918089,
"acc_norm_stderr": 0.012794553754288687
},
"harness|hellaswag|10": {
"acc": 0.724457279426409,
"acc_stderr": 0.004458742356237875,
"acc_norm": 0.892850029874527,
"acc_norm_stderr": 0.0030867169185536053
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.04451807959055328,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.04451807959055328
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563973,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563973
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948482,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948482
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.40397350993377484,
"acc_stderr": 0.040064856853653415,
"acc_norm": 0.40397350993377484,
"acc_norm_stderr": 0.040064856853653415
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553346,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553346
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389094,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389094
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281365,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281365
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903341,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903341
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4301675977653631,
"acc_stderr": 0.01655860163604104,
"acc_norm": 0.4301675977653631,
"acc_norm_stderr": 0.01655860163604104
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.02555316999182652,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.02555316999182652
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.02982074719142244,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.02982074719142244
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396553,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396553
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6132190942472461,
"mc1_stderr": 0.017048857010515103,
"mc2": 0.7497415375798714,
"mc2_stderr": 0.014308422950656522
},
"harness|winogrande|5": {
"acc": 0.8547750591949487,
"acc_stderr": 0.009902153904760826
},
"harness|gsm8k|5": {
"acc": 0.6830932524639879,
"acc_stderr": 0.012815868296721364
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
fewshot-goes-multilingual/cs_czech-named-entity-corpus_2.0 | ---
annotations_creators:
- expert-generated
language:
- cs
language_creators:
- found
license:
- cc-by-nc-sa-3.0
multilinguality:
- monolingual
pretty_name: Czech Named Entity Corpus 2.0
size_categories:
- 1K<n<10K
source_datasets:
- original
tags:
- czech NER
- CNEC
task_categories:
- token-classification
task_ids:
- named-entity-recognition
---
# Dataset Card for Czech Named Entity Corpus 2.0
## Dataset Description
The dataset contains Czech sentences and annotated named entities. The total number of sentences is around 9,000 and the total number of entities is around 34,000 (totals across train + validation + test).
## Dataset Features
Each sample contains:
- `text`: source sentence
- `entities`: list of selected entities. Each entity contains:
- `category_id`: string identifier of the entity category
- `category_str`: human-friendly category name in Czech (verbalizer)
- `start`: index on which the entity starts in the source sentence
- `end`: index on which the entity ends in the source sentence
- `content`: entity content, created as `text[start:end]`
- `entity_id`: unique entity string identifier
- `parent_id`: if an entity was annotated inside another entity (e.g. a house number inside an address), `parent_id` is the identifier of the parent entity, `None` otherwise
The `entity_id` field was checked to be globally unique (across data samples and dataset splits).
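To illustrate how these fields fit together, here is a minimal sketch against a hypothetical sample (the sentence, spans, and identifiers below are made up for illustration, not taken from the corpus): the `content` of every entity can be recovered by slicing the source sentence, and `parent_id` links nested entities to their enclosing one.

```python
# Hypothetical sample in the format described above (not a real corpus record).
sample = {
    "text": "Jan Novák bydlí v Praze.",
    "entities": [
        {"category_id": "P",  "category_str": "celé jméno",    "start": 0,  "end": 9,
         "content": "Jan Novák", "entity_id": "e1", "parent_id": None},
        {"category_id": "pf", "category_str": "křestní jméno", "start": 0,  "end": 3,
         "content": "Jan",       "entity_id": "e2", "parent_id": "e1"},
        {"category_id": "ps", "category_str": "příjmení",      "start": 4,  "end": 9,
         "content": "Novák",     "entity_id": "e3", "parent_id": "e1"},
        {"category_id": "gu", "category_str": "město/zámek",   "start": 18, "end": 23,
         "content": "Praze",     "entity_id": "e4", "parent_id": None},
    ],
}

# Invariant stated above: content == text[start:end].
for ent in sample["entities"]:
    assert sample["text"][ent["start"]:ent["end"]] == ent["content"]

# Group nested entities under their parent via parent_id.
children = {}
for ent in sample["entities"]:
    if ent["parent_id"] is not None:
        children.setdefault(ent["parent_id"], []).append(ent["content"])

print(children)  # {'e1': ['Jan', 'Novák']}
```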
## Entity categories
The list of the recognized entities (`category_id`, `category_str` pairs):
```python3
{
'A': 'číslo v adrese / kontaktním údaji',
'ah': 'číslo domu',
'at': 'telefonní číslo / fax',
'az': 'PSČ (poštovní směrovací číslo)',
'C': 'reference/bibliografie',
'f': 'cizí výraz',
'g_': 'geografický název - jiný',
'gc': 'stát/země',
'gh': 'jméno vodstva',
'gl': 'přírodní oblast/útvar',
'gq': 'městská čtvrť',
'gr': 'území',
'gs': 'ulice/náměstí',
'gt': 'kontinent',
'gu': 'město/zámek',
'i_': 'instituce - jiná',
'ia': 'konference/soutěž',
'ic': 'kulturní/vzdělávací/vědecká instituce',
'if': 'komerční instituce',
'io': 'vládní/politická instituce',
'me': 'emailová adresa',
'mi': 'URL / internetový odkaz',
'mn': 'časopis',
'ms': 'radio/televizní stanice',
'n_': 'číselný výraz - jiný',
'na': 'věk',
'nb': 'číslo stránky/kapitoly/sekce/objektu',
'nc': 'množství/počet',
'ni': 'číslo položky',
'no': 'pořadí',
'ns': 'sportovní skóre',
'o_': 'artefakt - jiný',
'oa': 'umělecké dílo / kulturní artefakt',
'oe': 'jednotka',
'om': 'měna',
'op': 'produkt/výrobek',
'or': 'zákon/směrnice/listina',
'P': 'celé jméno',
'p_': 'jméno - jiné',
'pc': 'národnost',
'pd': '(akademický) titul',
'pf': 'křestní jméno',
'pm': 'prostřední jméno',
'pp': 'mýtická/historická postava',
'ps': 'příjmení',
's': 'zkratka',
'T': 'čas/datum',
'td': 'den',
'tf': 'svátky',
'th': 'hodiny/minuty',
'tm': 'měsíc',
'ty': 'rok',
}
```
## Dataset Source
The dataset is a preprocessed adaptation of the existing CNEC 2.0 dataset [project info](https://ufal.mff.cuni.cz/cnec/cnec2.0), [link to data](https://lindat.mff.cuni.cz/repository/xmlui/handle/11858/00-097C-0000-0023-1B22-8). This adaptation contains (almost) the same data, but converted to a more convenient format. In addition, we inspected the data and decided to remove the entity categories `?`, `segm`, `cap`, `lower`, and `upper`, which were either undocumented or carried little semantic meaning.
The category names (verbalizers) are not in the original dataset. They were added by a Czech native speaker using the available [documentation](https://ufal.mff.cuni.cz/cnec/cnec2.0) and by looking at several occurrences in the data.
## Citation
Cite authors of the [original dataset](https://lindat.mff.cuni.cz/repository/xmlui/handle/11858/00-097C-0000-0023-1B22-8):
```bibtex
@misc{11858/00-097C-0000-0023-1B22-8,
title = {Czech Named Entity Corpus 2.0},
author = {{\v S}ev{\v c}{\'{\i}}kov{\'a}, Magda and {\v Z}abokrtsk{\'y}, Zden{\v e}k and Strakov{\'a}, Jana and Straka, Milan},
url = {http://hdl.handle.net/11858/00-097C-0000-0023-1B22-8},
note = {{LINDAT}/{CLARIAH}-{CZ} digital library at the Institute of Formal and Applied Linguistics ({{\'U}FAL}), Faculty of Mathematics and Physics, Charles University},
copyright = {Attribution-{NonCommercial}-{ShareAlike} 3.0 Unported ({CC} {BY}-{NC}-{SA} 3.0)},
year = {2014}
}
```
|
magicmachine/wizzypedia | ---
license: cc-by-nc-3.0
language:
- en
tags:
- art
pretty_name: Wizzypedia
size_categories:
- 1K<n<10K
---
# Wizzypedia
These datasets are created from the Forgotten Runes Wizard's Cult Wizzypedia.
You can find the [Wizzypedia here](http://wizzypedia.forgottenrunes.com/).
Guide to the datasets:
* `tokenized-wizzypedia-400.jsonl` - 400 token chunks encoded with tiktoken `cl100k_base` encoding |
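The chunk file is JSON Lines (one JSON object per line). Since the exact record keys are not documented above, here is a generic streaming reader, shown against a tiny stand-in file with hypothetical keys:

```python
import json
import os
import tempfile

# Stand-in file mimicking the one-object-per-line layout of
# tokenized-wizzypedia-400.jsonl; the "id"/"chunk" keys are assumptions.
path = os.path.join(tempfile.mkdtemp(), "chunks.jsonl")
with open(path, "w", encoding="utf-8") as f:
    f.write('{"id": 0, "chunk": "The Secret Tower ..."}\n')
    f.write('{"id": 1, "chunk": "Red Wizard capes ..."}\n')

def read_jsonl(p):
    """Yield one parsed record per non-empty line."""
    with open(p, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:
                yield json.loads(line)

records = list(read_jsonl(path))
print(len(records))  # 2
```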
atgarcia/EMGSoundTest | ---
dataset_info:
features:
- name: text
dtype: string
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: emg
sequence:
sequence: float64
- name: emg_sound
sequence: float64
splits:
- name: test
num_bytes: 2551479378
num_examples: 1075
download_size: 1587695729
dataset_size: 2551479378
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
icpython/Spotter_Docs | ---
license: unknown
---
|
bigbio/sciq |
---
language:
- en
bigbio_language:
- English
license: cc-by-nc-3.0
multilinguality: monolingual
bigbio_license_shortname: CC_BY_NC_3p0
pretty_name: SciQ
homepage: https://allenai.org/data/sciq
bigbio_pubmed: False
bigbio_public: True
bigbio_tasks:
- QUESTION_ANSWERING
---
# Dataset Card for SciQ
## Dataset Description
- **Homepage:** https://allenai.org/data/sciq
- **Pubmed:** False
- **Public:** True
- **Tasks:** QA
The SciQ dataset contains 13,679 crowdsourced science exam questions about Physics, Chemistry and Biology, among others. The questions are in multiple-choice format with 4 answer options each. For most questions, an additional paragraph with supporting evidence for the correct answer is provided.
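As a sketch of the multiple-choice format described above, the snippet below builds a shuffled 4-option question from one record. The field names (`question`, `correct_answer`, `distractor1`–`distractor3`, `support`) follow the upstream SciQ release and are an assumption here — check this repository's BigBIO schema for the exact layout.

```python
import random

# Hypothetical record in the style of the upstream SciQ release.
record = {
    "question": "What type of organism is commonly used in the preparation "
                "of foods such as cheese and yogurt?",
    "correct_answer": "mesophilic organisms",
    "distractor1": "protozoa",
    "distractor2": "gymnosperms",
    "distractor3": "viruses",
    "support": "Mesophiles grow best at moderate temperatures ...",
}

# Collect the 4 answer options and shuffle with a fixed seed for reproducibility.
options = [record["correct_answer"],
           record["distractor1"], record["distractor2"], record["distractor3"]]
random.Random(0).shuffle(options)

# Track where the gold answer landed after shuffling.
gold_index = options.index(record["correct_answer"])
assert options[gold_index] == record["correct_answer"]
```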
## Citation Information
```
@inproceedings{welbl-etal-2017-crowdsourcing,
title = "Crowdsourcing Multiple Choice Science Questions",
author = "Welbl, Johannes and
Liu, Nelson F. and
Gardner, Matt",
booktitle = "Proceedings of the 3rd Workshop on Noisy User-generated Text",
month = sep,
year = "2017",
address = "Copenhagen, Denmark",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/W17-4413",
doi = "10.18653/v1/W17-4413",
pages = "94--106",
}
```
|
RoshanAdhithya/autotrain-data-finalbartmodel | ---
task_categories:
- summarization
---
# AutoTrain Dataset for project: finalbartmodel
## Dataset Description
This dataset has been automatically processed by AutoTrain for project finalbartmodel.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"text": "Four people standing in an enclosure with a sign that says \" The Human Shop \" on it . The Human Shop",
"feat_Unnamed: 1": null,
"target": "Four people standing in an enclosure with a sign that says \" The Human Shop \" on it . ",
"feat_Unnamed: 3": null,
"feat_Unnamed: 4": null,
"feat_Unnamed: 5": null,
"feat_Unnamed: 6": null,
"feat_Unnamed: 7": null,
"feat_Unnamed: 8": null,
"feat_Unnamed: 9": null,
"feat_Unnamed: 10": null,
"feat_Unnamed: 11": null,
"feat_Unnamed: 12": null,
"feat_Unnamed: 13": null,
"feat_Unnamed: 14": null,
"feat_Unnamed: 15": null
},
{
"text": "a man carrying a sign that says free hug along the sidewalk .Free hugs",
"feat_Unnamed: 1": null,
"target": "a man carrying a sign that says free hug along the sidewalk .",
"feat_Unnamed: 3": null,
"feat_Unnamed: 4": null,
"feat_Unnamed: 5": null,
"feat_Unnamed: 6": null,
"feat_Unnamed: 7": null,
"feat_Unnamed: 8": null,
"feat_Unnamed: 9": null,
"feat_Unnamed: 10": null,
"feat_Unnamed: 11": null,
"feat_Unnamed: 12": null,
"feat_Unnamed: 13": null,
"feat_Unnamed: 14": null,
"feat_Unnamed: 15": null
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"text": "Value(dtype='string', id=None)",
"feat_Unnamed: 1": "Value(dtype='float64', id=None)",
"target": "Value(dtype='string', id=None)",
"feat_Unnamed: 3": "Value(dtype='float64', id=None)",
"feat_Unnamed: 4": "Value(dtype='float64', id=None)",
"feat_Unnamed: 5": "Value(dtype='float64', id=None)",
"feat_Unnamed: 6": "Value(dtype='float64', id=None)",
"feat_Unnamed: 7": "Value(dtype='float64', id=None)",
"feat_Unnamed: 8": "Value(dtype='float64', id=None)",
"feat_Unnamed: 9": "Value(dtype='float64', id=None)",
"feat_Unnamed: 10": "Value(dtype='float64', id=None)",
"feat_Unnamed: 11": "Value(dtype='float64', id=None)",
"feat_Unnamed: 12": "Value(dtype='float64', id=None)",
"feat_Unnamed: 13": "Value(dtype='float64', id=None)",
"feat_Unnamed: 14": "Value(dtype='float64', id=None)",
"feat_Unnamed: 15": "Value(dtype='float64', id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 408 |
| valid | 102 |
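Since the `feat_Unnamed` columns above are null in every sample shown, a small cleanup step can drop them before training. This is a hedged sketch operating on plain record dicts like the ones in the example above; it is not part of the AutoTrain pipeline:

```python
def drop_all_null_fields(records):
    """Remove keys whose value is None in every record."""
    if not records:
        return []
    # Keep a key if at least one record has a non-null value for it.
    keep = {k for rec in records for k, v in rec.items() if v is not None}
    return [{k: v for k, v in rec.items() if k in keep} for rec in records]
```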
|
ProCreations/Test | ---
license: apache-2.0
---
|
CyberHarem/yamashiro_takane_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of yamashiro_takane (Touhou)
This is the dataset of yamashiro_takane (Touhou), containing 254 images and their tags.
The core tags of this character are `green_hair, hat, green_eyes, flat_cap, medium_hair, bangs, green_headwear, camouflage_headwear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 254 | 299.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yamashiro_takane_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 254 | 186.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yamashiro_takane_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 578 | 381.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yamashiro_takane_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 254 | 273.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yamashiro_takane_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 578 | 514.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yamashiro_takane_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/yamashiro_takane_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 21 |  |  |  |  |  | 1girl, solo, boots, green_shirt, key, simple_background, white_background, brown_footwear, frills, full_body, green_skirt, camouflage_jacket, long_sleeves, smile, holding_card, pocket, standing, looking_at_viewer, open_mouth, backpack, blue_headwear, box |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | boots | green_shirt | key | simple_background | white_background | brown_footwear | frills | full_body | green_skirt | camouflage_jacket | long_sleeves | smile | holding_card | pocket | standing | looking_at_viewer | open_mouth | backpack | blue_headwear | box |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------------|:------|:--------------------|:-------------------|:-----------------|:---------|:------------|:--------------|:--------------------|:---------------|:--------|:---------------|:---------|:-----------|:--------------------|:-------------|:-----------|:----------------|:------|
| 0 | 21 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
open-llm-leaderboard/details_MaziyarPanahi__M7Yamshadowexperiment28_Strangemerges_30Experiment26 | ---
pretty_name: Evaluation run of MaziyarPanahi/M7Yamshadowexperiment28_Strangemerges_30Experiment26
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MaziyarPanahi/M7Yamshadowexperiment28_Strangemerges_30Experiment26](https://huggingface.co/MaziyarPanahi/M7Yamshadowexperiment28_Strangemerges_30Experiment26)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MaziyarPanahi__M7Yamshadowexperiment28_Strangemerges_30Experiment26\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-09T10:28:00.994181](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__M7Yamshadowexperiment28_Strangemerges_30Experiment26/blob/main/results_2024-04-09T10-28-00.994181.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6508821789914124,\n\
\ \"acc_stderr\": 0.03207251204949206,\n \"acc_norm\": 0.650057066127438,\n\
\ \"acc_norm_stderr\": 0.03274572904790381,\n \"mc1\": 0.6303549571603427,\n\
\ \"mc1_stderr\": 0.016898180706973878,\n \"mc2\": 0.7813193022414375,\n\
\ \"mc2_stderr\": 0.013666530160211392\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7150170648464164,\n \"acc_stderr\": 0.013191348179838795,\n\
\ \"acc_norm\": 0.7337883959044369,\n \"acc_norm_stderr\": 0.012915774781523198\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7171878111929895,\n\
\ \"acc_stderr\": 0.004494454911844619,\n \"acc_norm\": 0.8916550487950607,\n\
\ \"acc_norm_stderr\": 0.003101803574556311\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.041539484047423976,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.041539484047423976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815632,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815632\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.0303883535518868,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.0303883535518868\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250447,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250447\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.013586619219903343,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.013586619219903343\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4245810055865922,\n\
\ \"acc_stderr\": 0.016531170993278888,\n \"acc_norm\": 0.4245810055865922,\n\
\ \"acc_norm_stderr\": 0.016531170993278888\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n\
\ \"acc_stderr\": 0.012751075788015057,\n \"acc_norm\": 0.4726205997392438,\n\
\ \"acc_norm_stderr\": 0.012751075788015057\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578334,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6303549571603427,\n\
\ \"mc1_stderr\": 0.016898180706973878,\n \"mc2\": 0.7813193022414375,\n\
\ \"mc2_stderr\": 0.013666530160211392\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8484609313338595,\n \"acc_stderr\": 0.010077698907571776\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6952236542835482,\n \
\ \"acc_stderr\": 0.012679297549515425\n }\n}\n```"
repo_url: https://huggingface.co/MaziyarPanahi/M7Yamshadowexperiment28_Strangemerges_30Experiment26
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|arc:challenge|25_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|gsm8k|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hellaswag|10_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T10-28-00.994181.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T10-28-00.994181.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- '**/details_harness|winogrande|5_2024-04-09T10-28-00.994181.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-09T10-28-00.994181.parquet'
- config_name: results
data_files:
- split: 2024_04_09T10_28_00.994181
path:
- results_2024-04-09T10-28-00.994181.parquet
- split: latest
path:
- results_2024-04-09T10-28-00.994181.parquet
---
# Dataset Card for Evaluation run of MaziyarPanahi/M7Yamshadowexperiment28_Strangemerges_30Experiment26
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [MaziyarPanahi/M7Yamshadowexperiment28_Strangemerges_30Experiment26](https://huggingface.co/MaziyarPanahi/M7Yamshadowexperiment28_Strangemerges_30Experiment26) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, one for each of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MaziyarPanahi__M7Yamshadowexperiment28_Strangemerges_30Experiment26",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-04-09T10:28:00.994181](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__M7Yamshadowexperiment28_Strangemerges_30Experiment26/blob/main/results_2024-04-09T10-28-00.994181.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" config and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6508821789914124,
"acc_stderr": 0.03207251204949206,
"acc_norm": 0.650057066127438,
"acc_norm_stderr": 0.03274572904790381,
"mc1": 0.6303549571603427,
"mc1_stderr": 0.016898180706973878,
"mc2": 0.7813193022414375,
"mc2_stderr": 0.013666530160211392
},
"harness|arc:challenge|25": {
"acc": 0.7150170648464164,
"acc_stderr": 0.013191348179838795,
"acc_norm": 0.7337883959044369,
"acc_norm_stderr": 0.012915774781523198
},
"harness|hellaswag|10": {
"acc": 0.7171878111929895,
"acc_stderr": 0.004494454911844619,
"acc_norm": 0.8916550487950607,
"acc_norm_stderr": 0.003101803574556311
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.041539484047423976,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.041539484047423976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815632,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815632
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.0303883535518868,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.0303883535518868
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250447,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250447
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903343,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903343
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500104,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500104
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4245810055865922,
"acc_stderr": 0.016531170993278888,
"acc_norm": 0.4245810055865922,
"acc_norm_stderr": 0.016531170993278888
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4726205997392438,
"acc_stderr": 0.012751075788015057,
"acc_norm": 0.4726205997392438,
"acc_norm_stderr": 0.012751075788015057
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578334,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6303549571603427,
"mc1_stderr": 0.016898180706973878,
"mc2": 0.7813193022414375,
"mc2_stderr": 0.013666530160211392
},
"harness|winogrande|5": {
"acc": 0.8484609313338595,
"acc_stderr": 0.010077698907571776
},
"harness|gsm8k|5": {
"acc": 0.6952236542835482,
"acc_stderr": 0.012679297549515425
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Jirui/testing | ---
license: afl-3.0
---
|
Venkatesh26/Salesforce_flow_xml | ---
license: apache-2.0
---
|
carnival13/hotpot_FiD | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: supporting_facts
sequence:
- name: title
dtype: string
- name: sent_id
dtype: int32
- name: context
sequence:
- name: title
dtype: string
- name: sentences
sequence: string
- name: para_list
sequence: string
- name: output
dtype: string
- name: gold_para
sequence: int64
- name: act_idxs
sequence: int64
- name: input10
sequence: string
splits:
- name: train
num_bytes: 1749489303
num_examples: 90447
- name: validation
num_bytes: 143793645
num_examples: 7405
download_size: 1048534101
dataset_size: 1893282948
---
# Dataset Card for "hotpot_FiD"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MartinDx/first | ---
license: mit
---
|
usvsnsp/pile-pythia-code-vs-nl-scores | ---
dataset_info:
features:
- name: sequence_id
dtype: int64
- name: nl_scores
dtype: float32
splits:
- name: standard
num_bytes: 1757184000
num_examples: 146432000
- name: deduped
num_bytes: 1757184000
num_examples: 146432000
download_size: 2547384724
dataset_size: 3514368000
configs:
- config_name: default
data_files:
- split: standard
path: data/standard-*
- split: deduped
path: data/deduped-*
---
|
Pm06/my-image-label-dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: vision_info
dtype: string
splits:
- name: train
num_bytes: 247252517.0
num_examples: 1000
download_size: 246904988
dataset_size: 247252517.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
J3H0X77K/CHAMOX | ---
license: afl-3.0
---
|
AayushShah/SQL_SparC_Dataset_With_Schema | ---
dataset_info:
features:
- name: database_id
dtype: string
- name: query
dtype: string
- name: question
dtype: string
- name: metadata
dtype: string
splits:
- name: train
num_bytes: 3249206
num_examples: 3456
download_size: 288326
dataset_size: 3249206
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "SQL_SparC_Dataset_With_Schema"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/GPTeacher_roleplay_standardized_cluster_1_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 739269
num_examples: 911
download_size: 452181
dataset_size: 739269
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "GPTeacher_roleplay_standardized_cluster_1_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_cola_emphatic_reflex | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 1105
num_examples: 18
- name: test
num_bytes: 1348
num_examples: 19
- name: train
num_bytes: 5069
num_examples: 71
download_size: 9283
dataset_size: 7522
---
# Dataset Card for "MULTI_VALUE_cola_emphatic_reflex"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rksys/EYE_DISEASE_CLASSIFICATION | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': cataract
'1': diabetic_retinopathy
'2': glaucoma
'3': normal
splits:
- name: train
num_bytes: 705412289.905
num_examples: 3795
- name: test
num_bytes: 66862462.0
num_examples: 422
download_size: 772276059
dataset_size: 772274751.905
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
centaurus-alpha/Hunyuan-DialogBen | ---
license: mit
---
|
tilyupo/squad_cqa | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 74256565
num_examples: 87599
- name: validation
num_bytes: 9215052
num_examples: 10570
download_size: 14907663
dataset_size: 83471617
---
# Dataset Card for "squad_cqa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Multimodal-Fatima/OxfordPets_test_facebook_opt_125m_Attributes_ns_3669 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: image
dtype: image
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
- name: scores
sequence: float64
splits:
- name: fewshot_0_bs_16
num_bytes: 121000492.375
num_examples: 3669
- name: fewshot_1_bs_16
num_bytes: 121909173.375
num_examples: 3669
- name: fewshot_3_bs_16
num_bytes: 123709349.375
num_examples: 3669
- name: fewshot_5_bs_16
num_bytes: 125501892.375
num_examples: 3669
- name: fewshot_8_bs_16
num_bytes: 128203231.375
num_examples: 3669
download_size: 602523943
dataset_size: 620324138.875
---
# Dataset Card for "OxfordPets_test_facebook_opt_125m_Attributes_ns_3669"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
4lchemistX/miadataset | ---
license: apache-2.0
dataset_info:
features:
- name: text
dtype: string
splits:
- name: test
num_bytes: 815439
num_examples: 518
- name: train
num_bytes: 1783300
num_examples: 1100
download_size: 12444120
dataset_size: 2598739
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
reflecticai/data500 | ---
license: apache-2.0
---
|
tyzhu/fw_baseline_squad_train_100_eval_100 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: eval_find_word
path: data/eval_find_word-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 35302
num_examples: 100
- name: eval_find_word
num_bytes: 35307
num_examples: 100
- name: validation
num_bytes: 35307
num_examples: 100
download_size: 77242
dataset_size: 105916
---
# Dataset Card for "fw_baseline_squad_train_100_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/am_rfb_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of am_rfb/AmRFB/RFB (Girls' Frontline)
This is the dataset of am_rfb/AmRFB/RFB (Girls' Frontline), containing 338 images and their tags.
The core tags of this character are `long_hair, green_eyes, brown_hair, bangs, hair_bun, bow, double_bun, hair_bow, breasts, medium_breasts, ahoge, hair_between_eyes, green_bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 338 | 491.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/am_rfb_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 338 | 276.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/am_rfb_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 838 | 608.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/am_rfb_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 338 | 437.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/am_rfb_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 838 | 857.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/am_rfb_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/am_rfb_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, black_dress, black_footwear, blush, bullpup, camouflage_jacket, choker, dog_tags, fur_trim, looking_at_viewer, smile, solo, black_gloves, collarbone, fingerless_gloves, full_body, assault_rifle, white_background, mary_janes, off_shoulder, striped_socks, vertical_stripes, ankle_cuffs, asymmetrical_legwear, covered_navel, holding_gun, simple_background, sketch, teddy_bear, trigger_discipline |
| 1 | 10 |  |  |  |  |  | 1girl, assault_rifle, bullpup, camouflage_jacket, choker, collarbone, fur_trim, looking_at_viewer, solo, black_dress, blush, holding_gun, smile, cleavage, dog_tags, trigger_discipline, black_gloves, fingerless_gloves, character_name, socks, striped |
| 2 | 11 |  |  |  |  |  | 1girl, black_dress, blush, camouflage_jacket, collarbone, looking_at_viewer, smile, solo, choker, bare_shoulders, cleavage, dog_tags, fur-trimmed_jacket, off_shoulder, closed_mouth, white_background, necklace |
| 3 | 6 |  |  |  |  |  | 1girl, bare_shoulders, blush, collarbone, fur-trimmed_jacket, looking_at_viewer, off_shoulder, solo, black_choker, cleavage, closed_mouth, simple_background, smile, upper_body, camouflage_jacket, white_background, black_dress, dog_tags, open_jacket |
| 4 | 5 |  |  |  |  |  | bare_shoulders, black_dress, black_gloves, blush, camouflage_jacket, choker, collarbone, dog_tags, fingerless_gloves, socks, white_background, 1girl, :d, asymmetrical_legwear, black_footwear, holding_handheld_game_console, looking_at_viewer, nintendo_switch, off_shoulder, open_mouth, solo, fur-trimmed_jacket, joy-con, knees_up, mary_janes, simple_background, sitting, vertical_stripes, character_name, convenient_leg, full_body, small_breasts, standing |
| 5 | 39 |  |  |  |  |  | 1girl, solo, bare_shoulders, blush, looking_at_viewer, official_alternate_costume, red_dress, red_bow, smile, cleavage, christmas, collarbone, choker, black_pantyhose, open_coat, open_mouth, belt, holding, strapless_dress, red_footwear, duffel_coat, nintendo_switch, sidelocks, boots, closed_mouth, fur-trimmed_dress, off_shoulder |
| 6 | 11 |  |  |  |  |  | 1girl, looking_at_viewer, outdoors, solo, blush, smile, cloud, day, blue_sky, dress, open_mouth, collarbone, black_bikini, cleavage, frills, navel |
| 7 | 14 |  |  |  |  |  | 1girl, blush, looking_at_viewer, solo, floral_print, wide_sleeves, smile, long_sleeves, red_kimono, hakama_skirt, official_alternate_costume, purple_hakama, holding, obi, open_mouth, red_bow, closed_mouth, print_kimono |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_dress | black_footwear | blush | bullpup | camouflage_jacket | choker | dog_tags | fur_trim | looking_at_viewer | smile | solo | black_gloves | collarbone | fingerless_gloves | full_body | assault_rifle | white_background | mary_janes | off_shoulder | striped_socks | vertical_stripes | ankle_cuffs | asymmetrical_legwear | covered_navel | holding_gun | simple_background | sketch | teddy_bear | trigger_discipline | cleavage | character_name | socks | striped | bare_shoulders | fur-trimmed_jacket | closed_mouth | necklace | black_choker | upper_body | open_jacket | :d | holding_handheld_game_console | nintendo_switch | open_mouth | joy-con | knees_up | sitting | convenient_leg | small_breasts | standing | official_alternate_costume | red_dress | red_bow | christmas | black_pantyhose | open_coat | belt | holding | strapless_dress | red_footwear | duffel_coat | sidelocks | boots | fur-trimmed_dress | outdoors | cloud | day | blue_sky | dress | black_bikini | frills | navel | floral_print | wide_sleeves | long_sleeves | red_kimono | hakama_skirt | purple_hakama | obi | print_kimono |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:-----------------|:--------|:----------|:--------------------|:---------|:-----------|:-----------|:--------------------|:--------|:-------|:---------------|:-------------|:--------------------|:------------|:----------------|:-------------------|:-------------|:---------------|:----------------|:-------------------|:--------------|:-----------------------|:----------------|:--------------|:--------------------|:---------|:-------------|:---------------------|:-----------|:-----------------|:--------|:----------|:-----------------|:---------------------|:---------------|:-----------|:---------------|:-------------|:--------------|:-----|:--------------------------------|:------------------|:-------------|:----------|:-----------|:----------|:-----------------|:----------------|:-----------|:-----------------------------|:------------|:----------|:------------|:------------------|:------------|:-------|:----------|:------------------|:---------------|:--------------|:------------|:--------|:--------------------|:-----------|:--------|:------|:-----------|:--------|:---------------|:---------|:--------|:---------------|:---------------|:---------------|:-------------|:---------------|:----------------|:------|:---------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | | X | | | | | | | | | X | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 11 |  |  |  |  |  | X | X | | X | | X | X | X | | X | X | X | | X | | | | X | | X | | | | | | | | | | | X | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | X | | X | | X | | X | | X | X | X | | X | | | | X | | X | | | | | | | X | | | | X | | | | X | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | X | X | | X | X | X | | X | | X | X | X | X | X | | X | X | X | | X | | X | | | X | | | | | X | X | | X | X | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 39 |  |  |  |  |  | X | | | X | | | X | | | X | X | X | | X | | | | | | X | | | | | | | | | | | X | | | | X | | X | | | | | | | X | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 6 | 11 |  |  |  |  |  | X | | | X | | | | | | X | X | X | | X | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | |
| 7 | 14 |  |  |  |  |  | X | | | X | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | X | | | | | | | X | | X | | | | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X |
|
victor/autotrain-data-image-classification-test-18 | ---
task_categories:
- image-classification
---
# AutoTrain Dataset for project: image-classification-test-18
## Dataset Description
This dataset has been automatically processed by AutoTrain for project image-classification-test-18.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<224x224 RGB PIL image>",
"target": 2
},
{
"image": "<224x224 RGB PIL image>",
"target": 2
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(num_classes=3, names=['ADONIS', 'AFRICAN GIANT SWALLOWTAIL', 'AMERICAN SNOOT'], id=None)"
}
```
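The integer `target` in each sample indexes into the `ClassLabel` names listed above. A minimal sketch of decoding a label (the class list is copied from the field description; this helper is illustrative, not part of AutoTrain):

```python
# Class names exactly as declared in the ClassLabel feature above.
CLASS_NAMES = ["ADONIS", "AFRICAN GIANT SWALLOWTAIL", "AMERICAN SNOOT"]

def decode_target(target: int) -> str:
    """Map an integer `target` back to its human-readable class name."""
    return CLASS_NAMES[target]

# Both samples shown above carry target 2:
print(decode_target(2))  # AMERICAN SNOOT
```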
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 269 |
| valid | 69 |
|
Cheetor1996/Serena_aku_no_onna_kanbu | ---
license: cc-by-2.0
language:
- en
tags:
- art
pretty_name: Serena (Aku no onna kanbu)
---
**Serena** from **Aku no onna kanbu**
* *Trained with Anime (final-full-pruned) model.*
* *3 versions: 6 epochs for less restriction to the original art style and activation tags, and 9 & 10 epochs for closer accuracy to the character.*
* *Recommended LoRA weights: 0.8-1*
* *Works best with ALL, MIDD, OUTD, and OUTALL LoRA weight blocks.*
* *Activation tags: **serena (aku no onna kanbu)** to get the character's traits, and **serena_ova_bikini** for the character's bikini outfit used briefly in the beach OVA.*
* *Use **serena (aku no onna kanbu)** alongside short hair and yellow eyes to get the character as accurate as possible.*
* *Use **serena_ova_bikini** like this: "[serena_ova_bikini:(short shorts, boyshorts:1.2), (strapless bikini:1.25):1.1]" or "[serena_ova_bikini:(short shorts, boyshorts:1.2):1.0]" with (strapless bikini:1.25) to get the character as accurate as possible.*
* *To get other outfits that aren't accessed with serena_ova_bikini, you can try any of the following:*
* *Place **(yellow bikini:1.2), (strapless:1.2), (strapless bikini:1.2)** in the "Negative Prompt" box.*
* *Increase the weights of tags used for the desired outfit. ex; (blue shirt:1.2), (red skirt:1.2)*
* *Use the OUTALL LoRA weight blocks for this LoRA.*
* *Use the Inpainting functions to correct any "mistake" in the images, or to draw a piece of clothing over the initial image.*
|
DianaJin/winter | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 31704632
num_examples: 33
- name: test
num_bytes: 4802920
num_examples: 5
- name: valid
num_bytes: 3842872
num_examples: 4
download_size: 13977535
dataset_size: 40350424
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
NobodyExistsOnTheInternet/toxicqa | ---
license: mit
tags:
- not-for-all-audiences
---
Full, 8K-long ToxicQA. Unprocessed; not recommended for use as-is.
Use only for alignment research. NOETI is not responsible for what you might do with it. |
ersdd/footballpostures | ---
license: other
license_name: other
license_link: LICENSE
---
|
Falah/Fibonacci_Golden_Ratio_Style_Prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 2093547587
num_examples: 4000000
download_size: 295669145
dataset_size: 2093547587
---
# Dataset Card for the Fibonacci Golden Ratio Style Prompts Dataset
## Description
The Fibonacci Golden Ratio Style Prompts Dataset is a collection of prompts designed to inspire artists to incorporate the Fibonacci golden ratio into their art. The dataset provides a set of prompts in string format, which artists can use as creative starting points for their artworks. The golden ratio is known for its aesthetic appeal and has been used by artists, architects, and designers throughout history to achieve harmonious proportions in their creations.
## Features:
- `prompts`: A string feature containing artistic prompts.
## Dataset Splits:
- `train`: This split contains 4,000,000 examples.
## Dataset Size:
- Total size on disk: 2,093,547,587 bytes
## Download Size:
- The complete dataset can be downloaded as a single file with a size of 295,669,145 bytes.
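As background for the prompts' theme: the golden ratio φ ≈ 1.618 emerges as the limit of ratios of consecutive Fibonacci numbers. A quick illustrative snippet (not part of the dataset pipeline):

```python
import math

def fib_ratio(n: int) -> float:
    """Ratio of consecutive Fibonacci numbers after n steps."""
    a, b = 1, 1
    for _ in range(n):
        a, b = b, a + b
    return b / a

phi = (1 + math.sqrt(5)) / 2  # the golden ratio, ~1.6180339887
print(abs(fib_ratio(30) - phi) < 1e-9)  # True: the ratios converge to phi
```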
|
andrewkroening/538-NBA-Historical-Raptor | ---
license: cc
---
## Dataset Overview
### Intro
This dataset was downloaded from the good folks at fivethirtyeight. You can find the original (or in the future, updated) versions of this and several similar datasets at [this GitHub link.](https://github.com/fivethirtyeight/data/tree/master/nba-raptor)
### Data layout
Here are the columns in this dataset, which contains data on every NBA player, broken out by season, since the 1976 NBA-ABA merger:
Column | Description
-------|---------------
`player_name` | Player name
`player_id` | Basketball-Reference.com player ID
`season` | Season
`season_type` | Regular season (RS) or playoff (PO)
`team` | Basketball-Reference ID of team
`poss` | Possessions played
`mp` | Minutes played
`raptor_box_offense` | Points above average per 100 possessions added by player on offense, based only on box score estimate
`raptor_box_defense` | Points above average per 100 possessions added by player on defense, based only on box score estimate
`raptor_box_total` | Points above average per 100 possessions added by player, based only on box score estimate
`raptor_onoff_offense` | Points above average per 100 possessions added by player on offense, based only on plus-minus data
`raptor_onoff_defense` | Points above average per 100 possessions added by player on defense, based only on plus-minus data
`raptor_onoff_total` | Points above average per 100 possessions added by player, based only on plus-minus data
`raptor_offense` | Points above average per 100 possessions added by player on offense, using both box and on-off components
`raptor_defense` | Points above average per 100 possessions added by player on defense, using both box and on-off components
`raptor_total` | Points above average per 100 possessions added by player on both offense and defense, using both box and on-off components
`war_total` | Wins Above Replacement between regular season and playoffs
`war_reg_season` | Wins Above Replacement for regular season
`war_playoffs` | Wins Above Replacement for playoffs
`predator_offense` | Predictive points above average per 100 possessions added by player on offense
`predator_defense` | Predictive points above average per 100 possessions added by player on defense
`predator_total` | Predictive points above average per 100 possessions added by player on both offense and defense
`pace_impact` | Player impact on team possessions per 48 minutes
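As a quick sketch of how the total columns relate to their components (toy numbers, not real player data; the additive relationships are read off the column descriptions above):

```python
# Toy rows reusing the column names from the table above
# (made-up values, not real player data).
rows = [
    {"player_name": "Player A", "war_reg_season": 8.0, "war_playoffs": 1.5,
     "raptor_offense": 3.25, "raptor_defense": 1.0},
    {"player_name": "Player B", "war_reg_season": 4.5, "war_playoffs": 0.0,
     "raptor_offense": -0.5, "raptor_defense": 2.0},
]

for row in rows:
    # Per the descriptions, total WAR spans regular season + playoffs,
    # and total RAPTOR sums its offensive and defensive halves.
    row["war_total"] = row["war_reg_season"] + row["war_playoffs"]
    row["raptor_total"] = row["raptor_offense"] + row["raptor_defense"]

print(rows[0]["war_total"])     # 9.5
print(rows[1]["raptor_total"])  # 1.5
```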
### More information
This dataset was put together for Hugging Face by this guy: [Andrew Kroening](https://github.com/andrewkroening)
He was building some kind of a silly tool using this dataset. It's an NBA WAR Predictor tool, and you can find the Gradio interface [here.](https://huggingface.co/spaces/andrewkroening/nba-war-predictor) The GitHub repo can be found [here.](https://github.com/andrewkroening/nba-war-predictor-tool) |
liuyanchen1015/MULTI_VALUE_stsb_present_for_exp_perfect | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 15931
num_examples: 78
- name: test
num_bytes: 8785
num_examples: 43
- name: train
num_bytes: 41274
num_examples: 174
download_size: 54440
dataset_size: 65990
---
# Dataset Card for "MULTI_VALUE_stsb_present_for_exp_perfect"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Universal-NER/Pile-NER-type | ---
language:
- en
size_categories:
- 10K<n<100K
---
# Intro
Pile-NER-type is a set of GPT-generated data for named entity recognition using the type-based data construction prompt. It was collected by prompting gpt-3.5-turbo-0301 and augmented by negative sampling. Check our [project page](https://universal-ner.github.io/) for more information.
# License
Attribution-NonCommercial 4.0 International |
AdapterOcean/code_instructions_standardized_cluster_13_std | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: cluster
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 15440637
num_examples: 13810
download_size: 8250343
dataset_size: 15440637
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "code_instructions_standardized_cluster_13_std"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ibrahimhamamci/DENTEX | ---
title: "DENTEX Dataset"
license: cc-by-nc-sa-4.0
---
<p align="center">
<img src="https://huggingface.co/datasets/ibrahimhamamci/DENTEX/resolve/main/figures/dentex.jpg?download=true" width="100%">
</p>
Welcome to the official page of the DENTEX dataset, which has been released as part of the [Dental Enumeration and Diagnosis on Panoramic X-rays Challenge (DENTEX)](https://dentex.grand-challenge.org/), organized in conjunction with the International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI) in 2023. The primary objective of this challenge is to develop algorithms that can accurately detect abnormal teeth with dental enumeration and associated diagnosis. This not only aids in accurate treatment planning but also helps practitioners carry out procedures with a low margin of error.
The challenge provides three types of hierarchically annotated data and additional unlabeled X-rays for optional pre-training. The annotation of the data is structured using the Fédération Dentaire Internationale (FDI) system. The first set of data is partially labeled because it only includes quadrant info. The second set of data is also partially labeled but contains additional enumeration information along with the quadrant. The third set is fully labeled because it includes all quadrant-enumeration-diagnosis information for each abnormal tooth, and all participant algorithms have been benchmarked on this third set, with an example output shown below.
<p align="center">
<img src="https://huggingface.co/datasets/ibrahimhamamci/DENTEX/resolve/main/figures/output.png?download=true" width="100%">
</p>
## DENTEX Dataset
The DENTEX dataset comprises panoramic dental X-rays obtained from three different institutions using standard clinical conditions but varying equipment and imaging protocols, resulting in diverse image quality reflecting heterogeneous clinical practice. The dataset includes X-rays from patients aged 12 and above, randomly selected from the hospital's database to ensure patient privacy and confidentiality.
To enable effective use of the FDI system, the dataset is hierarchically organized into three types of data:
- (a) 693 X-rays labeled for quadrant detection and quadrant classes only,
- (b) 634 X-rays labeled for tooth detection with quadrant and tooth enumeration classes,
- (c) 1005 X-rays fully labeled for abnormal tooth detection with quadrant, tooth enumeration, and diagnosis classes.
The diagnosis class includes four specific categories: caries, deep caries, periapical lesions, and impacted teeth. An additional 1571 unlabeled X-rays are provided for pre-training.
<p align="center">
<img src="https://huggingface.co/datasets/ibrahimhamamci/DENTEX/resolve/main/figures/data.png?download=true" width="100%">
</p>
## Annotation Protocol
The DENTEX dataset provides three hierarchically annotated datasets to support various dental detection tasks: (1) quadrant-only for quadrant detection, (2) quadrant-enumeration for tooth detection, and (3) quadrant-enumeration-diagnosis for abnormal tooth detection. While offering a quadrant detection dataset might appear redundant, it is essential for effectively using the FDI Numbering System. This globally recognized system assigns numbers 1 through 4 to the quadrants of the mouth: top right (1), top left (2), bottom left (3), and bottom right (4). Within each quadrant, it numbers the teeth from 1 to 8, starting from the front middle tooth and increasing toward the back molar. For instance, the back tooth on the lower left side is designated as 38 in FDI notation, indicating quadrant 3, tooth 8. Thus, the quadrant segmentation dataset greatly simplifies the dental enumeration task, though evaluations are conducted only on the fully annotated third dataset.
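As a small illustrative sketch (a toy helper, not part of the challenge tooling), the FDI convention described above can be expressed in a few lines:

```python
def fdi_number(quadrant: int, tooth: int) -> int:
    """Combine a quadrant (1-4) and a tooth position (1-8) into FDI notation."""
    if quadrant not in range(1, 5) or tooth not in range(1, 9):
        raise ValueError("quadrant must be in 1-4 and tooth in 1-8")
    # FDI notation is simply the quadrant digit followed by the tooth digit.
    return quadrant * 10 + tooth

# Quadrant 4, tooth position 8 is written as 48 in FDI notation.
print(fdi_number(4, 8))  # -> 48
```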
## Data Split for Evaluation and Training
The DENTEX 2023 dataset comprises three types of data: (a) partially annotated quadrant data, (b) partially annotated quadrant-enumeration data, and (c) fully annotated quadrant-enumeration-diagnosis data. The first two types of data are intended for training and development purposes, while the third type is used for training and evaluations.
To comply with standard machine learning practices, the fully annotated third dataset, consisting of 1005 panoramic X-rays, is partitioned into training, validation, and testing subsets, comprising 705, 50, and 250 images, respectively. Ground truth labels are provided only for the training data, while the validation data is provided without associated ground truth. All the ground truth data is now available for researchers.
Note: The datasets are fully identical to the data used for our baseline method, named HierarchicalDet. For more information, please visit the [MICCAI paper](https://conferences.miccai.org/2023/papers/205-Paper2550.html) and the [GitHub repository](https://github.com/ibrahimethemhamamci/DENTEX) of HierarchicalDet (Diffusion-Based Hierarchical Multi-Label Object Detection to Analyze Panoramic Dental X-rays).
## Citing Us
If you use DENTEX, we would appreciate references to the following papers:
```
1. @article{hamamci2023dentex,
title={DENTEX: An Abnormal Tooth Detection with Dental Enumeration and Diagnosis Benchmark for Panoramic X-rays},
author={Hamamci, Ibrahim Ethem and Er, Sezgin and Simsar, Enis and Yuksel, Atif Emre and Gultekin, Sadullah and Ozdemir, Serife Damla and Yang, Kaiyuan and Li, Hongwei Bran and Pati, Sarthak and Stadlinger, Bernd and others},
journal={arXiv preprint arXiv:2305.19112},
year={2023}
}
2. @inproceedings{hamamci2023diffusion,
title={Diffusion-based hierarchical multi-label object detection to analyze panoramic dental x-rays},
author={Hamamci, Ibrahim Ethem and Er, Sezgin and Simsar, Enis and Sekuboyina, Anjany and Gundogar, Mustafa and Stadlinger, Bernd and Mehl, Albert and Menze, Bjoern},
booktitle={International Conference on Medical Image Computing and Computer-Assisted Intervention},
pages={389--399},
year={2023},
organization={Springer}
}
```
## License
We are committed to fostering innovation and collaboration in the research community. To this end, all elements of the DENTEX dataset are released under a [Creative Commons Attribution (CC-BY-NC-SA) license](https://creativecommons.org/licenses/by-nc-sa/4.0/). This licensing framework ensures that our contributions can be freely used for non-commercial research purposes, while also encouraging contributions and modifications, provided that the original work is properly cited and any derivative works are shared under similar terms.
|
Deojoandco/capstone_fromgpt_without_gold_all | ---
dataset_info:
features:
- name: dialogue
dtype: string
- name: summary
dtype: string
- name: gold_tags
dtype: string
- name: query
dtype: string
- name: gpt_success
dtype: bool
- name: gpt_response
dtype: string
- name: gold_tags_tokens_count
dtype: int64
- name: GPT_OUTPUT_FOUND
dtype: bool
- name: gpt_output_tags
dtype: string
- name: gpt_output_tag_tokens
dtype: int64
- name: summary_gpt_tags_token_count_match
dtype: bool
- name: gpt_output_token_count
dtype: int64
- name: gpt_output_tag_count
dtype: int64
- name: summary_gpt_token_count_match
dtype: bool
splits:
- name: train
num_bytes: 537874
num_examples: 100
download_size: 85969
dataset_size: 537874
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "capstone_fromgpt_without_gold"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mboth/sichern-50-undersampled | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: Datatype
dtype: string
- name: Beschreibung
dtype: string
- name: Name
dtype: string
- name: Unit
dtype: string
- name: text
dtype: string
- name: Grundfunktion
dtype: string
- name: label
dtype:
class_label:
names:
'0': Brandmeldeanlage
'1': Brandschutzklappe
'2': Einbruchmeldeanlage
'3': Entrauchung-Ventilator
'4': Feuerlöschanlage
'5': Gaswarnanlage
'6': Notruf
'7': Rauchmeldeanlage
splits:
- name: train
num_bytes: 38006.082374966565
num_examples: 193
- name: test
num_bytes: 186480
num_examples: 935
- name: valid
num_bytes: 186480
num_examples: 935
download_size: 130269
dataset_size: 410966.0823749666
---
# Dataset Card for "sichern-50-undersampled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DylanonWic/common_voice_10_1_th_clean_split_0 | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: labels
sequence: int64
- name: input_values
sequence: float32
splits:
- name: train
num_bytes: 12101560609
num_examples: 50670
download_size: 11891879164
dataset_size: 12101560609
---
# Dataset Card for "common_voice_10_1_th_clean_split_0_fix_spacial_char"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_mrpc_nasal_possessive_pron | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 114277
num_examples: 409
- name: train
num_bytes: 223284
num_examples: 801
- name: validation
num_bytes: 27559
num_examples: 97
download_size: 248717
dataset_size: 365120
---
# Dataset Card for "MULTI_VALUE_mrpc_nasal_possessive_pron"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SGBTalha/MyyModels | ---
license: openrail
---
|
math_dataset | ---
pretty_name: Mathematics Dataset
language:
- en
paperswithcode_id: mathematics
dataset_info:
- config_name: algebra__linear_1d
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 516405
num_examples: 10000
- name: train
num_bytes: 92086245
num_examples: 1999998
download_size: 2333082954
dataset_size: 92602650
- config_name: algebra__linear_1d_composed
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 1018090
num_examples: 10000
- name: train
num_bytes: 199566926
num_examples: 1999998
download_size: 2333082954
dataset_size: 200585016
- config_name: algebra__linear_2d
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 666095
num_examples: 10000
- name: train
num_bytes: 126743526
num_examples: 1999998
download_size: 2333082954
dataset_size: 127409621
- config_name: algebra__linear_2d_composed
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 1184664
num_examples: 10000
- name: train
num_bytes: 234405885
num_examples: 1999998
download_size: 2333082954
dataset_size: 235590549
- config_name: algebra__polynomial_roots
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 868630
num_examples: 10000
- name: train
num_bytes: 163134199
num_examples: 1999998
download_size: 2333082954
dataset_size: 164002829
- config_name: algebra__polynomial_roots_composed
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 1281321
num_examples: 10000
- name: train
num_bytes: 251435312
num_examples: 1999998
download_size: 2333082954
dataset_size: 252716633
- config_name: algebra__sequence_next_term
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 752459
num_examples: 10000
- name: train
num_bytes: 138735194
num_examples: 1999998
download_size: 2333082954
dataset_size: 139487653
- config_name: algebra__sequence_nth_term
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 947764
num_examples: 10000
- name: train
num_bytes: 175945643
num_examples: 1999998
download_size: 2333082954
dataset_size: 176893407
- config_name: arithmetic__add_or_sub
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 483725
num_examples: 10000
- name: train
num_bytes: 89690356
num_examples: 1999998
download_size: 2333082954
dataset_size: 90174081
- config_name: arithmetic__add_or_sub_in_base
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 502221
num_examples: 10000
- name: train
num_bytes: 93779137
num_examples: 1999998
download_size: 2333082954
dataset_size: 94281358
- config_name: arithmetic__add_sub_multiple
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 498421
num_examples: 10000
- name: train
num_bytes: 90962782
num_examples: 1999998
download_size: 2333082954
dataset_size: 91461203
- config_name: arithmetic__div
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 421520
num_examples: 10000
- name: train
num_bytes: 78417908
num_examples: 1999998
download_size: 2333082954
dataset_size: 78839428
- config_name: arithmetic__mixed
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 513364
num_examples: 10000
- name: train
num_bytes: 93989009
num_examples: 1999998
download_size: 2333082954
dataset_size: 94502373
- config_name: arithmetic__mul
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 394004
num_examples: 10000
- name: train
num_bytes: 73499093
num_examples: 1999998
download_size: 2333082954
dataset_size: 73893097
- config_name: arithmetic__mul_div_multiple
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 497308
num_examples: 10000
- name: train
num_bytes: 91406689
num_examples: 1999998
download_size: 2333082954
dataset_size: 91903997
- config_name: arithmetic__nearest_integer_root
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 705630
num_examples: 10000
- name: train
num_bytes: 137771237
num_examples: 1999998
download_size: 2333082954
dataset_size: 138476867
- config_name: arithmetic__simplify_surd
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 1261753
num_examples: 10000
- name: train
num_bytes: 207753790
num_examples: 1999998
download_size: 2333082954
dataset_size: 209015543
- config_name: calculus__differentiate
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 1025947
num_examples: 10000
- name: train
num_bytes: 199013993
num_examples: 1999998
download_size: 2333082954
dataset_size: 200039940
- config_name: calculus__differentiate_composed
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 1343416
num_examples: 10000
- name: train
num_bytes: 263757570
num_examples: 1999998
download_size: 2333082954
dataset_size: 265100986
- config_name: comparison__closest
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 681229
num_examples: 10000
- name: train
num_bytes: 132274822
num_examples: 1999998
download_size: 2333082954
dataset_size: 132956051
- config_name: comparison__closest_composed
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 1071089
num_examples: 10000
- name: train
num_bytes: 210658152
num_examples: 1999998
download_size: 2333082954
dataset_size: 211729241
- config_name: comparison__kth_biggest
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 797185
num_examples: 10000
- name: train
num_bytes: 149077463
num_examples: 1999998
download_size: 2333082954
dataset_size: 149874648
- config_name: comparison__kth_biggest_composed
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 1144556
num_examples: 10000
- name: train
num_bytes: 221547532
num_examples: 1999998
download_size: 2333082954
dataset_size: 222692088
- config_name: comparison__pair
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 452528
num_examples: 10000
- name: train
num_bytes: 85707543
num_examples: 1999998
download_size: 2333082954
dataset_size: 86160071
- config_name: comparison__pair_composed
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 946187
num_examples: 10000
- name: train
num_bytes: 184702998
num_examples: 1999998
download_size: 2333082954
dataset_size: 185649185
- config_name: comparison__sort
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 712498
num_examples: 10000
- name: train
num_bytes: 131752705
num_examples: 1999998
download_size: 2333082954
dataset_size: 132465203
- config_name: comparison__sort_composed
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 1114257
num_examples: 10000
- name: train
num_bytes: 213871896
num_examples: 1999998
download_size: 2333082954
dataset_size: 214986153
- config_name: measurement__conversion
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 592904
num_examples: 10000
- name: train
num_bytes: 118650852
num_examples: 1999998
download_size: 2333082954
dataset_size: 119243756
- config_name: measurement__time
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 584278
num_examples: 10000
- name: train
num_bytes: 116962599
num_examples: 1999998
download_size: 2333082954
dataset_size: 117546877
- config_name: numbers__base_conversion
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 490881
num_examples: 10000
- name: train
num_bytes: 90363333
num_examples: 1999998
download_size: 2333082954
dataset_size: 90854214
- config_name: numbers__div_remainder
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 644523
num_examples: 10000
- name: train
num_bytes: 125046212
num_examples: 1999998
download_size: 2333082954
dataset_size: 125690735
- config_name: numbers__div_remainder_composed
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 1151347
num_examples: 10000
- name: train
num_bytes: 226341870
num_examples: 1999998
download_size: 2333082954
dataset_size: 227493217
- config_name: numbers__gcd
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 659492
num_examples: 10000
- name: train
num_bytes: 127914889
num_examples: 1999998
download_size: 2333082954
dataset_size: 128574381
- config_name: numbers__gcd_composed
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 1206805
num_examples: 10000
- name: train
num_bytes: 237534189
num_examples: 1999998
download_size: 2333082954
dataset_size: 238740994
- config_name: numbers__is_factor
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 396129
num_examples: 10000
- name: train
num_bytes: 75875988
num_examples: 1999998
download_size: 2333082954
dataset_size: 76272117
- config_name: numbers__is_factor_composed
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 949828
num_examples: 10000
- name: train
num_bytes: 185369842
num_examples: 1999998
download_size: 2333082954
dataset_size: 186319670
- config_name: numbers__is_prime
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 385749
num_examples: 10000
- name: train
num_bytes: 73983639
num_examples: 1999998
download_size: 2333082954
dataset_size: 74369388
- config_name: numbers__is_prime_composed
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 947888
num_examples: 10000
- name: train
num_bytes: 184808483
num_examples: 1999998
download_size: 2333082954
dataset_size: 185756371
- config_name: numbers__lcm
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 717978
num_examples: 10000
- name: train
num_bytes: 136826050
num_examples: 1999998
download_size: 2333082954
dataset_size: 137544028
- config_name: numbers__lcm_composed
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 1127744
num_examples: 10000
- name: train
num_bytes: 221148668
num_examples: 1999998
download_size: 2333082954
dataset_size: 222276412
- config_name: numbers__list_prime_factors
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 585749
num_examples: 10000
- name: train
num_bytes: 109982816
num_examples: 1999998
download_size: 2333082954
dataset_size: 110568565
- config_name: numbers__list_prime_factors_composed
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 1053510
num_examples: 10000
- name: train
num_bytes: 205379513
num_examples: 1999998
download_size: 2333082954
dataset_size: 206433023
- config_name: numbers__place_value
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 496977
num_examples: 10000
- name: train
num_bytes: 95180091
num_examples: 1999998
download_size: 2333082954
dataset_size: 95677068
- config_name: numbers__place_value_composed
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 1011130
num_examples: 10000
- name: train
num_bytes: 197187918
num_examples: 1999998
download_size: 2333082954
dataset_size: 198199048
- config_name: numbers__round_number
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 570636
num_examples: 10000
- name: train
num_bytes: 111472483
num_examples: 1999998
download_size: 2333082954
dataset_size: 112043119
- config_name: numbers__round_number_composed
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 1016754
num_examples: 10000
- name: train
num_bytes: 201057283
num_examples: 1999998
download_size: 2333082954
dataset_size: 202074037
- config_name: polynomials__add
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 1308455
num_examples: 10000
- name: train
num_bytes: 257576092
num_examples: 1999998
download_size: 2333082954
dataset_size: 258884547
- config_name: polynomials__coefficient_named
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 1137226
num_examples: 10000
- name: train
num_bytes: 219716251
num_examples: 1999998
download_size: 2333082954
dataset_size: 220853477
- config_name: polynomials__collect
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 774709
num_examples: 10000
- name: train
num_bytes: 143743260
num_examples: 1999998
download_size: 2333082954
dataset_size: 144517969
- config_name: polynomials__compose
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 1209763
num_examples: 10000
- name: train
num_bytes: 233651887
num_examples: 1999998
download_size: 2333082954
dataset_size: 234861650
- config_name: polynomials__evaluate
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 599446
num_examples: 10000
- name: train
num_bytes: 114538250
num_examples: 1999998
download_size: 2333082954
dataset_size: 115137696
- config_name: polynomials__evaluate_composed
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 1148362
num_examples: 10000
- name: train
num_bytes: 226022455
num_examples: 1999998
download_size: 2333082954
dataset_size: 227170817
- config_name: polynomials__expand
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 1057353
num_examples: 10000
- name: train
num_bytes: 202338235
num_examples: 1999998
download_size: 2333082954
dataset_size: 203395588
- config_name: polynomials__simplify_power
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 1248040
num_examples: 10000
- name: train
num_bytes: 216407582
num_examples: 1999998
download_size: 2333082954
dataset_size: 217655622
- config_name: probability__swr_p_level_set
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 1159050
num_examples: 10000
- name: train
num_bytes: 227540179
num_examples: 1999998
download_size: 2333082954
dataset_size: 228699229
- config_name: probability__swr_p_sequence
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 1097442
num_examples: 10000
- name: train
num_bytes: 215865725
num_examples: 1999998
download_size: 2333082954
dataset_size: 216963167
---
# Dataset Card for "math_dataset"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://github.com/deepmind/mathematics_dataset](https://github.com/deepmind/mathematics_dataset)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 130.65 GB
- **Size of the generated dataset:** 9.08 GB
- **Total amount of disk used:** 139.73 GB
### Dataset Summary
Mathematics database.
This dataset code generates mathematical question and answer pairs,
from a range of question types at roughly school-level difficulty.
This is designed to test the mathematical learning and algebraic
reasoning skills of learning models.
Original paper: Analysing Mathematical Reasoning Abilities of Neural Models
(Saxton, Grefenstette, Hill, Kohli).
Example usage:
```python
train_examples, val_examples = datasets.load_dataset(
    'math_dataset/arithmetic__mul',
    split=['train', 'test'],
    as_supervised=True)
```
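The usage above follows the original TensorFlow Datasets API (`as_supervised` is a TFDS argument). With the Hugging Face `datasets` library, a roughly equivalent sketch (an assumption about current usage, not part of the original card) passes the config name separately:

```python
def load_math_config(name: str = "arithmetic__mul"):
    """Load one `math_dataset` config; each example has `question`/`answer` strings.

    Requires the `datasets` library and network access when called; recent
    library versions may also need `trust_remote_code=True`, since this
    dataset is backed by a loading script.
    """
    from datasets import load_dataset
    return load_dataset("math_dataset", name)

# Example (needs network):
#   splits = load_math_config()
#   print(splits["train"][0])  # {'question': ..., 'answer': ...}
```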
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### algebra__linear_1d
- **Size of downloaded dataset files:** 2.33 GB
- **Size of the generated dataset:** 92.60 MB
- **Total amount of disk used:** 2.43 GB
An example of 'train' looks as follows.
```
```
#### algebra__linear_1d_composed
- **Size of downloaded dataset files:** 2.33 GB
- **Size of the generated dataset:** 200.58 MB
- **Total amount of disk used:** 2.53 GB
An example of 'train' looks as follows.
```
```
#### algebra__linear_2d
- **Size of downloaded dataset files:** 2.33 GB
- **Size of the generated dataset:** 127.41 MB
- **Total amount of disk used:** 2.46 GB
An example of 'train' looks as follows.
```
```
#### algebra__linear_2d_composed
- **Size of downloaded dataset files:** 2.33 GB
- **Size of the generated dataset:** 235.59 MB
- **Total amount of disk used:** 2.57 GB
An example of 'train' looks as follows.
```
```
#### algebra__polynomial_roots
- **Size of downloaded dataset files:** 2.33 GB
- **Size of the generated dataset:** 164.01 MB
- **Total amount of disk used:** 2.50 GB
An example of 'train' looks as follows.
```
```
### Data Fields
The data fields are the same among all splits.
#### algebra__linear_1d
- `question`: a `string` feature.
- `answer`: a `string` feature.
#### algebra__linear_1d_composed
- `question`: a `string` feature.
- `answer`: a `string` feature.
#### algebra__linear_2d
- `question`: a `string` feature.
- `answer`: a `string` feature.
#### algebra__linear_2d_composed
- `question`: a `string` feature.
- `answer`: a `string` feature.
#### algebra__polynomial_roots
- `question`: a `string` feature.
- `answer`: a `string` feature.
### Data Splits
| name | train |test |
|---------------------------|------:|----:|
|algebra__linear_1d |1999998|10000|
|algebra__linear_1d_composed|1999998|10000|
|algebra__linear_2d |1999998|10000|
|algebra__linear_2d_composed|1999998|10000|
|algebra__polynomial_roots |1999998|10000|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@article{saxton2019analysing,
  author  = {Saxton, David and Grefenstette, Edward and Hill, Felix and Kohli, Pushmeet},
  title   = {Analysing Mathematical Reasoning Abilities of Neural Models},
  journal = {arXiv preprint arXiv:1904.01557},
  year    = {2019}
}
```
### Contributions
Thanks to [@patrickvonplaten](https://github.com/patrickvonplaten), [@lewtun](https://github.com/lewtun), [@thomwolf](https://github.com/thomwolf) for adding this dataset. |
helloAQ/small_data | ---
license: apache-2.0
---
|
Geonmo/deepfashion-multimodal-descriptions | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 9586020
num_examples: 40770
download_size: 2270474
dataset_size: 9586020
---
# Dataset Card for "deepfashion-multimodal-descriptions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
brayene/tr-ChatGPT-Jailbreak-Prompts | ---
dataset_info:
features:
- name: Name
dtype: string
- name: Prompt
dtype: string
- name: Votes
dtype: int64
- name: Jailbreak Score
dtype: int64
- name: GPT-4
dtype: string
- name: translation
dtype: string
splits:
- name: train
num_bytes: 324859
num_examples: 79
download_size: 166205
dataset_size: 324859
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Abzu/dolly_hhrlhf_wizard | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 108006083.60236111
num_examples: 84468
- name: test
num_bytes: 12001528.397638885
num_examples: 9386
download_size: 67011577
dataset_size: 120007612.0
---
# Dataset Card for "dolly_hhrlhf_wizard"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
conll2002 | ---
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- es
- nl
license:
- unknown
multilinguality:
- multilingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- token-classification
task_ids:
- named-entity-recognition
- part-of-speech
paperswithcode_id: conll-2002
pretty_name: CoNLL-2002
dataset_info:
- config_name: es
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: pos_tags
sequence:
class_label:
names:
'0': AO
'1': AQ
'2': CC
'3': CS
'4': DA
'5': DE
'6': DD
'7': DI
'8': DN
'9': DP
'10': DT
'11': Faa
'12': Fat
'13': Fc
'14': Fd
'15': Fe
'16': Fg
'17': Fh
'18': Fia
'19': Fit
'20': Fp
'21': Fpa
'22': Fpt
'23': Fs
'24': Ft
'25': Fx
'26': Fz
'27': I
'28': NC
'29': NP
'30': P0
'31': PD
'32': PI
'33': PN
'34': PP
'35': PR
'36': PT
'37': PX
'38': RG
'39': RN
'40': SP
'41': VAI
'42': VAM
'43': VAN
'44': VAP
'45': VAS
'46': VMG
'47': VMI
'48': VMM
'49': VMN
'50': VMP
'51': VMS
'52': VSG
'53': VSI
'54': VSM
'55': VSN
'56': VSP
'57': VSS
'58': Y
'59': Z
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
'7': B-MISC
'8': I-MISC
splits:
- name: train
num_bytes: 6672173
num_examples: 8324
- name: validation
num_bytes: 1333784
num_examples: 1916
- name: test
num_bytes: 1294156
num_examples: 1518
download_size: 4140690
dataset_size: 9300113
- config_name: nl
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: pos_tags
sequence:
class_label:
names:
'0': Adj
'1': Adv
'2': Art
'3': Conj
'4': Int
'5': Misc
'6': N
'7': Num
'8': Prep
'9': Pron
'10': Punc
'11': V
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
'7': B-MISC
'8': I-MISC
splits:
- name: train
num_bytes: 5308959
num_examples: 15807
- name: validation
num_bytes: 994298
num_examples: 2896
- name: test
num_bytes: 1808862
num_examples: 5196
download_size: 3642241
dataset_size: 8112119
config_names:
- es
- nl
---
# Dataset Card for CoNLL-2002
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [homepage](https://www.clips.uantwerpen.be/conll2002/ner/)
- **Repository:** [github](https://github.com/teropa/nlp/tree/master/resources/corpora/conll2002)
- **Paper:** [paper](https://www.aclweb.org/anthology/W02-2024/)
- **Point of Contact:** [Erik Tjong Kim Sang](mailto:erikt@uia.ua.ac.be)
### Dataset Summary
Named entities are phrases that contain the names of persons, organizations, locations, times and quantities. Example:
[PER Wolff] , currently a journalist in [LOC Argentina] , played with [PER Del Bosque] in the final years of the seventies in [ORG Real Madrid] .
The shared task of CoNLL-2002 concerns language-independent named entity recognition. We will concentrate on four types of named entities: persons, locations, organizations and names of miscellaneous entities that do not belong to the previous three groups. The participants of the shared task will be offered training and test data for at least two languages. They will use the data for developing a named-entity recognition system that includes a machine learning component. Information sources other than the training data may be used in this shared task. We are especially interested in methods that can use additional unannotated data for improving their performance (for example co-training).
### Supported Tasks and Leaderboards
Named Entity Recognition (NER) is a subtask of Information Extraction. Different NER systems were evaluated as a part of the Sixth Message Understanding Conference in 1995 (MUC6). The target language was English. The participating systems performed well. However, many of them used language-specific resources for performing the task and it is unknown how they would have performed on languages other than English.
After 1995 NER systems have been developed for some European languages and a few Asian languages. There have been at least two studies that have applied one NER system to different languages. Palmer and Day [PD97] have used statistical methods for finding named entities in newswire articles in Chinese, English, French, Japanese, Portuguese and Spanish. They found that the difficulty of the NER task was different for the six languages but that a large part of the task could be performed with simple methods. Cucerzan and Yarowsky [CY99] used both morphological and contextual clues for identifying named entities in English, Greek, Hindi, Romanian and Turkish. With minimal supervision, they obtained overall F measures between 40 and 70, depending on the languages used.
- `named-entity-recognition`: The performance in this task is measured with [F1](https://huggingface.co/metrics/f1) (higher is better). A named entity is correct only if it is an exact match of the corresponding entity in the data.
- `part-of-speech`: The performance in this task is measured with [F1](https://huggingface.co/metrics/f1) (higher is better). A part-of-speech tag is correct only if it is equal to the corresponding tag in the data.
### Languages
There are two languages available : Spanish (es) and Dutch (nl).
## Dataset Structure
### Data Instances
The examples look like this :
```
{'id': '0',
'ner_tags': [5, 6, 0, 0, 0, 0, 3, 0, 0],
'pos_tags': [4, 28, 13, 59, 28, 21, 29, 22, 20],
'tokens': ['La', 'Coruña', ',', '23', 'may', '(', 'EFECOM', ')', '.']
}
```
The original data files within the Dutch sub-dataset contain `-DOCSTART-` lines that act as boundaries between documents; these lines are filtered out in this implementation.
### Data Fields
- `id`: id of the sample
- `tokens`: the tokens of the example text
- `ner_tags`: the NER tags of each token
- `pos_tags`: the POS tags of each token
The POS tags correspond to this list for Spanish:
```
'AO', 'AQ', 'CC', 'CS', 'DA', 'DE', 'DD', 'DI', 'DN', 'DP', 'DT', 'Faa', 'Fat', 'Fc', 'Fd', 'Fe', 'Fg', 'Fh', 'Fia', 'Fit', 'Fp', 'Fpa', 'Fpt', 'Fs', 'Ft', 'Fx', 'Fz', 'I', 'NC', 'NP', 'P0', 'PD', 'PI', 'PN', 'PP', 'PR', 'PT', 'PX', 'RG', 'RN', 'SP', 'VAI', 'VAM', 'VAN', 'VAP', 'VAS', 'VMG', 'VMI', 'VMM', 'VMN', 'VMP', 'VMS', 'VSG', 'VSI', 'VSM', 'VSN', 'VSP', 'VSS', 'Y', 'Z'
```
And this list for Dutch:
```
'Adj', 'Adv', 'Art', 'Conj', 'Int', 'Misc', 'N', 'Num', 'Prep', 'Pron', 'Punc', 'V'
```
The NER tags correspond to this list:
```
"O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC", "B-MISC", "I-MISC",
```
The NER tags have the same format as in the chunking task: a B denotes the first item of a phrase and an I any non-initial word. There are four types of phrases: person names (PER), organizations (ORG), locations (LOC) and miscellaneous names (MISC).
It is assumed that named entities are non-recursive and non-overlapping. In case a named entity is embedded in another named entity, usually only the top-level entity is marked.
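The IOB tagging scheme described above can be decoded with a short helper. The sketch below is pure Python and uses the NER label list given in this card; applied to the example instance shown earlier, it recovers the two entity spans:

```python
# Decode integer ner_tags into (entity_type, text) spans using the
# IOB label list from this card.
NER_LABELS = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG",
              "B-LOC", "I-LOC", "B-MISC", "I-MISC"]

def extract_entities(tokens, tag_ids):
    """Group B-/I- tagged tokens into (entity_type, text) spans."""
    entities, current_type, current_tokens = [], None, []
    for token, tag_id in zip(tokens, tag_ids):
        label = NER_LABELS[tag_id]
        if label.startswith("B-"):
            # A B- tag always starts a new entity; flush any open span first.
            if current_tokens:
                entities.append((current_type, " ".join(current_tokens)))
            current_type, current_tokens = label[2:], [token]
        elif label.startswith("I-") and current_type == label[2:]:
            current_tokens.append(token)
        else:  # "O", or an I- tag without a matching open entity
            if current_tokens:
                entities.append((current_type, " ".join(current_tokens)))
            current_type, current_tokens = None, []
    if current_tokens:
        entities.append((current_type, " ".join(current_tokens)))
    return entities

# Applied to the example instance shown earlier:
tokens = ['La', 'Coruña', ',', '23', 'may', '(', 'EFECOM', ')', '.']
tags = [5, 6, 0, 0, 0, 0, 3, 0, 0]
print(extract_entities(tokens, tags))  # [('LOC', 'La Coruña'), ('ORG', 'EFECOM')]
```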
### Data Splits
For both configurations (Spanish and Dutch), there are three splits.
The original splits were named `train`, `testa` and `testb` and they correspond to the `train`, `validation` and `test` splits.
The splits have the following sizes :
| | train | validation | test |
| ----- |-------:|------------:|------:|
| N. Examples (Spanish) | 8324 | 1916 | 1518 |
| N. Examples (Dutch) | 15807 | 2896 | 5196 |
## Dataset Creation
### Curation Rationale
The dataset was created to provide named entity resources for two languages that were under-served by statistical machine learning at the time, Dutch and Spanish.
### Source Data
The Spanish data is a collection of news wire articles made available by the Spanish EFE News Agency. The articles are from May 2000.
The Dutch data consist of four editions of the Belgian newspaper "De Morgen" of 2000 (June 2, July 1, August 1 and September 1).
#### Initial Data Collection and Normalization
The articles were word-tokenized; information on the exact pre-processing pipeline is unavailable.
#### Who are the source language producers?
The source language was produced by journalists and writers employed by the news agency and newspaper mentioned above.
### Annotations
#### Annotation process
For the Dutch data, the annotator has followed the MITRE and SAIC guidelines for named entity recognition (Chinchor et al., 1999) as well as possible.
#### Who are the annotators?
The Spanish data annotation was carried out by the TALP Research Center of the Technical University of Catalonia (UPC) and the Center of Language and Computation (CLiC) of the University of Barcelona (UB).
The Dutch data was annotated as a part of the Atranos project at the University of Antwerp.
### Personal and Sensitive Information
The data is sourced from newspaper text and only contains mentions of public figures or individuals acting in public capacities.
## Considerations for Using the Data
### Social Impact of Dataset
Named Entity Recognition systems can be used to efficiently index news text, making it easy to gather all information pertaining to an organization or individual. Making such resources widely available in languages other than English can support better research and user experience for a larger part of the world's population. At the same time, better indexing and discoverability can also enable surveillance by state actors.
### Discussion of Biases
News text reproduces the biases of society, and any system trained on news data should be cognizant of these limitations and the risk for models to learn spurious correlations in this context, for example between a person's gender and their occupation.
### Other Known Limitations
Users should keep in mind that the dataset only contains news text, which might limit the applicability of the developed systems to other domains.
## Additional Information
### Dataset Curators
The annotation of the Spanish data was funded by the European Commission through the NAMIC project (IST-1999-12392).
### Licensing Information
The licensing status of the data, especially the news source text, is unknown.
### Citation Information
```
@inproceedings{tjong-kim-sang-2002-introduction,
title = "Introduction to the {C}o{NLL}-2002 Shared Task: Language-Independent Named Entity Recognition",
author = "Tjong Kim Sang, Erik F.",
booktitle = "{COLING}-02: The 6th Conference on Natural Language Learning 2002 ({C}o{NLL}-2002)",
year = "2002",
url = "https://www.aclweb.org/anthology/W02-2024",
}
```
### Contributions
Thanks to [@lhoestq](https://github.com/lhoestq) for adding this dataset. |
smangrul/MuDoConv | ---
license: cc-by-nc-4.0
---
Conversational data collated from 10 sources and preprocessed into `["texts", "labels"]` columns for training/fine-tuning sequence-to-sequence models such as T5 or BlenderBot. The 10 source datasets are:
1. blended_skill_talk
2. conv_ai_2
3. empathetic_dialogues
4. wizard_of_wikipedia
5. meta_woz
6. multi_woz
7. spolin
8. dailydialog
9. cornell_movie_dialogues
10. taskmaster
The data access and preprocessing code is [here](https://github.com/pacman100/accelerate-deepspeed-test/blob/main/src/data_preprocessing/DataPreprocessing.ipynb) |
LimYeri/leetcode_with_youtube_captions | ---
language:
- en
license: mit
size_categories:
- 10K<n<100K
task_categories:
- text-classification
- text-generation
pretty_name: Leetcode informations with youtube captions
tags:
- code
dataset_info:
features:
- name: cc_content
dtype: string
- name: id
dtype: int64
- name: thumbnail
dtype: string
- name: title
dtype: string
- name: question_content
dtype: string
- name: java
dtype: string
- name: c++
dtype: string
- name: python
dtype: string
- name: javascript
dtype: string
- name: title_slug
dtype: string
- name: tag
dtype: string
- name: level
dtype: string
- name: success_rate
dtype: float64
- name: total_submission
dtype: float64
- name: total_accepted
dtype: float64
- name: question_likes
dtype: float64
- name: question_dislikes
dtype: float64
- name: question_hints
dtype: string
- name: similar_question_ids
dtype: string
- name: num_tokens
dtype: int64
splits:
- name: train
num_bytes: 576312572
num_examples: 18136
download_size: 150441753
dataset_size: 576312572
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
Use this dataset (as a team) -> [kreimben/leetcode_with_youtube_captions](https://huggingface.co/datasets/kreimben/leetcode_with_youtube_captions)
The number of tokens in `cc_content` is calculated using `tiktoken` and stored in the new `num_tokens` column.
veeeeee/lamini_docs_processed | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 1934677.8
num_examples: 1134
- name: test
num_bytes: 214964.2
num_examples: 126
download_size: 634920
dataset_size: 2149642.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Qdrant/dbpedia-entities-openai3-text-embedding-3-large-3072-1M | ---
language:
- en
license: apache-2.0
size_categories:
- 1M<n<10M
task_categories:
- feature-extraction
pretty_name: OpenAI v3 Large 1M
dataset_info:
features:
- name: _id
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: text-embedding-ada-002-1536-embedding
sequence: float32
- name: text-embedding-3-large-3072-embedding
sequence: float64
splits:
- name: train
num_bytes: 31115725776
num_examples: 1000000
download_size: 24796927580
dataset_size: 31115725776
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
1M OpenAI Embeddings: text-embedding-3-large 3072 dimensions + ada-002 1536 dimensions — parallel dataset
- Created: February 2024.
- Text used for Embedding: title (string) + text (string)
- Embedding Model: text-embedding-3-large
- This dataset was generated from the first 1M entries of https://huggingface.co/datasets/BeIR/dbpedia-entity, extracted by @KShivendu_ [here](https://huggingface.co/datasets/KShivendu/dbpedia-entities-openai-1M)
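As a usage sketch, two embedding rows from this dataset can be compared with plain cosine similarity once their vectors are fetched. The snippet below is pure Python with no external dependencies; the toy 4-dimensional vectors stand in for the real 3072-dimensional `text-embedding-3-large-3072-embedding` column:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 4-dim stand-ins for the real 3072-dim embedding column:
v1 = [0.1, 0.2, 0.3, 0.4]
v2 = [0.1, 0.2, 0.3, 0.4]
print(round(cosine_similarity(v1, v2), 3))  # 1.0
```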
|
alvations/esci-data-task1 | ---
license: other
dataset_info:
features:
- name: example_id
dtype: int64
- name: query
dtype: string
- name: query_id
dtype: int64
- name: product_id
dtype: string
- name: product_locale
dtype: string
- name: esci_label
dtype: string
- name: small_version
dtype: int64
- name: large_version
dtype: int64
- name: split
dtype: string
- name: product_title
dtype: string
- name: product_description
dtype: string
- name: product_bullet_point
dtype: string
- name: product_brand
dtype: string
- name: product_color
dtype: string
- name: gain
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1030417721
num_examples: 777248
- name: dev
num_bytes: 5890341
num_examples: 4390
- name: test
num_bytes: 445424864
num_examples: 336373
download_size: 726913948
dataset_size: 1481732926
---
|
vietgpt-archive/vung-oi-reward-data | ---
dataset_info:
features:
- name: prompt
struct:
- name: option
list:
- name: answer_raw
dtype: string
- name: key
dtype: string
- name: question
dtype: string
- name: chocie
struct:
- name: answer_raw
dtype: string
- name: key
dtype: string
- name: eject
struct:
- name: answer_raw
dtype: string
- name: key
dtype: string
splits:
- name: train
num_bytes: 71784563
num_examples: 112037
download_size: 41832387
dataset_size: 71784563
---
# Dataset Card for "vung-oi-reward-data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mkhalifa/BioCite | ---
license: apache-2.0
tags:
- attribution
- citation
- pretraining
- synthetic
pretty_name: BioCite
paper: https://arxiv.org/abs/2404.01019
---
This is the synthetic dataset used for pretraining in the paper [Source-Aware Training Enables Knowledge Attribution in Language Models](https://arxiv.org/abs/2404.01019).
**Stats** (number of tokens is computed based on the TinyLLaMa tokenizer):
| | Size |
|--------------------------|---------|
| **Pretraining** | |
| \#documents | 100K |
| \#facts/sents | 408K |
| \#tokens | 5.7M |
| avg. sents per doc | 4.1 |
| avg. tokens per doc | 56.9 |
| **Instruction tuning** | |
| \#examples | 186K |
| \#tokens | 3.1M |
|
Schandkroete/SLC_Sentiment_Analysis | ---
task_categories:
- text-classification
---
This is information about the dataset |
SuperSecureHuman/chandamama_trial_dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 9058342.0
num_examples: 48
download_size: 9060393
dataset_size: 9058342.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_kimwooglae__AISquare-Instruct-SOLAR-10.7b-v0.5.31 | ---
pretty_name: Evaluation run of kimwooglae/AISquare-Instruct-SOLAR-10.7b-v0.5.31
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [kimwooglae/AISquare-Instruct-SOLAR-10.7b-v0.5.31](https://huggingface.co/kimwooglae/AISquare-Instruct-SOLAR-10.7b-v0.5.31)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kimwooglae__AISquare-Instruct-SOLAR-10.7b-v0.5.31\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-22T02:13:58.257879](https://huggingface.co/datasets/open-llm-leaderboard/details_kimwooglae__AISquare-Instruct-SOLAR-10.7b-v0.5.31/blob/main/results_2024-01-22T02-13-58.257879.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5330985732271527,\n\
\ \"acc_stderr\": 0.034185007803077,\n \"acc_norm\": 0.5352323665963996,\n\
\ \"acc_norm_stderr\": 0.034920748737001794,\n \"mc1\": 0.35006119951040393,\n\
\ \"mc1_stderr\": 0.01669794942015103,\n \"mc2\": 0.5134609475665187,\n\
\ \"mc2_stderr\": 0.014908191115467387\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5725255972696246,\n \"acc_stderr\": 0.014456862944650649,\n\
\ \"acc_norm\": 0.606655290102389,\n \"acc_norm_stderr\": 0.014275101465693028\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6441943835889266,\n\
\ \"acc_stderr\": 0.004777782584817781,\n \"acc_norm\": 0.8419637522405895,\n\
\ \"acc_norm_stderr\": 0.003640294912838683\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411022,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411022\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5263157894736842,\n \"acc_stderr\": 0.04063302731486671,\n\
\ \"acc_norm\": 0.5263157894736842,\n \"acc_norm_stderr\": 0.04063302731486671\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5207547169811321,\n \"acc_stderr\": 0.030746349975723463,\n\
\ \"acc_norm\": 0.5207547169811321,\n \"acc_norm_stderr\": 0.030746349975723463\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4682080924855491,\n\
\ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.4682080924855491,\n\
\ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171451,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171451\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4553191489361702,\n \"acc_stderr\": 0.032555253593403555,\n\
\ \"acc_norm\": 0.4553191489361702,\n \"acc_norm_stderr\": 0.032555253593403555\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n\
\ \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n\
\ \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.025197101074246487,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.025197101074246487\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n\
\ \"acc_stderr\": 0.04263906892795133,\n \"acc_norm\": 0.3492063492063492,\n\
\ \"acc_norm_stderr\": 0.04263906892795133\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6225806451612903,\n\
\ \"acc_stderr\": 0.02757596072327824,\n \"acc_norm\": 0.6225806451612903,\n\
\ \"acc_norm_stderr\": 0.02757596072327824\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3891625615763547,\n \"acc_stderr\": 0.03430462416103872,\n\
\ \"acc_norm\": 0.3891625615763547,\n \"acc_norm_stderr\": 0.03430462416103872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.03713158067481913,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.03713158067481913\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6363636363636364,\n \"acc_stderr\": 0.03427308652999934,\n \"\
acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.03427308652999934\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7098445595854922,\n \"acc_stderr\": 0.03275264467791516,\n\
\ \"acc_norm\": 0.7098445595854922,\n \"acc_norm_stderr\": 0.03275264467791516\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4794871794871795,\n \"acc_stderr\": 0.025329663163489943,\n\
\ \"acc_norm\": 0.4794871794871795,\n \"acc_norm_stderr\": 0.025329663163489943\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.46218487394957986,\n \"acc_stderr\": 0.032385469487589795,\n\
\ \"acc_norm\": 0.46218487394957986,\n \"acc_norm_stderr\": 0.032385469487589795\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6458715596330276,\n \"acc_stderr\": 0.020504729013829114,\n \"\
acc_norm\": 0.6458715596330276,\n \"acc_norm_stderr\": 0.020504729013829114\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2916666666666667,\n \"acc_stderr\": 0.030998666304560524,\n \"\
acc_norm\": 0.2916666666666667,\n \"acc_norm_stderr\": 0.030998666304560524\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6862745098039216,\n \"acc_stderr\": 0.032566854844603886,\n \"\
acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.032566854844603886\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7130801687763713,\n \"acc_stderr\": 0.02944377302259469,\n \
\ \"acc_norm\": 0.7130801687763713,\n \"acc_norm_stderr\": 0.02944377302259469\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\
\ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6019417475728155,\n \"acc_stderr\": 0.0484674825397724,\n\
\ \"acc_norm\": 0.6019417475728155,\n \"acc_norm_stderr\": 0.0484674825397724\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7606837606837606,\n\
\ \"acc_stderr\": 0.027951826808924333,\n \"acc_norm\": 0.7606837606837606,\n\
\ \"acc_norm_stderr\": 0.027951826808924333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7650063856960408,\n\
\ \"acc_stderr\": 0.015162024152278434,\n \"acc_norm\": 0.7650063856960408,\n\
\ \"acc_norm_stderr\": 0.015162024152278434\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.026226158605124655,\n\
\ \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.026226158605124655\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2916201117318436,\n\
\ \"acc_stderr\": 0.01520103251252044,\n \"acc_norm\": 0.2916201117318436,\n\
\ \"acc_norm_stderr\": 0.01520103251252044\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5849673202614379,\n \"acc_stderr\": 0.0282135041778241,\n\
\ \"acc_norm\": 0.5849673202614379,\n \"acc_norm_stderr\": 0.0282135041778241\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.572347266881029,\n\
\ \"acc_stderr\": 0.02809924077580955,\n \"acc_norm\": 0.572347266881029,\n\
\ \"acc_norm_stderr\": 0.02809924077580955\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.026725868809100793,\n\
\ \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.026725868809100793\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.425531914893617,\n \"acc_stderr\": 0.02949482760014437,\n \
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.02949482760014437\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39960886571056065,\n\
\ \"acc_stderr\": 0.01251018163696068,\n \"acc_norm\": 0.39960886571056065,\n\
\ \"acc_norm_stderr\": 0.01251018163696068\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4227941176470588,\n \"acc_stderr\": 0.030008562845003483,\n\
\ \"acc_norm\": 0.4227941176470588,\n \"acc_norm_stderr\": 0.030008562845003483\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5849673202614379,\n \"acc_stderr\": 0.01993362777685742,\n \
\ \"acc_norm\": 0.5849673202614379,\n \"acc_norm_stderr\": 0.01993362777685742\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.04653429807913508,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.04653429807913508\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4816326530612245,\n \"acc_stderr\": 0.031987615467631264,\n\
\ \"acc_norm\": 0.4816326530612245,\n \"acc_norm_stderr\": 0.031987615467631264\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6368159203980099,\n\
\ \"acc_stderr\": 0.034005985055990146,\n \"acc_norm\": 0.6368159203980099,\n\
\ \"acc_norm_stderr\": 0.034005985055990146\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.034462962170884265,\n\
\ \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.034462962170884265\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35006119951040393,\n\
\ \"mc1_stderr\": 0.01669794942015103,\n \"mc2\": 0.5134609475665187,\n\
\ \"mc2_stderr\": 0.014908191115467387\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.829518547750592,\n \"acc_stderr\": 0.01056902112282591\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.34268385140257773,\n \
\ \"acc_stderr\": 0.01307303023082791\n }\n}\n```"
repo_url: https://huggingface.co/kimwooglae/AISquare-Instruct-SOLAR-10.7b-v0.5.31
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|arc:challenge|25_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|gsm8k|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hellaswag|10_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T02-13-58.257879.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-22T02-13-58.257879.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- '**/details_harness|winogrande|5_2024-01-22T02-13-58.257879.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-22T02-13-58.257879.parquet'
- config_name: results
data_files:
- split: 2024_01_22T02_13_58.257879
path:
- results_2024-01-22T02-13-58.257879.parquet
- split: latest
path:
- results_2024-01-22T02-13-58.257879.parquet
---
# Dataset Card for Evaluation run of kimwooglae/AISquare-Instruct-SOLAR-10.7b-v0.5.31
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kimwooglae/AISquare-Instruct-SOLAR-10.7b-v0.5.31](https://huggingface.co/kimwooglae/AISquare-Instruct-SOLAR-10.7b-v0.5.31) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kimwooglae__AISquare-Instruct-SOLAR-10.7b-v0.5.31",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-01-22T02:13:58.257879](https://huggingface.co/datasets/open-llm-leaderboard/details_kimwooglae__AISquare-Instruct-SOLAR-10.7b-v0.5.31/blob/main/results_2024-01-22T02-13-58.257879.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5330985732271527,
"acc_stderr": 0.034185007803077,
"acc_norm": 0.5352323665963996,
"acc_norm_stderr": 0.034920748737001794,
"mc1": 0.35006119951040393,
"mc1_stderr": 0.01669794942015103,
"mc2": 0.5134609475665187,
"mc2_stderr": 0.014908191115467387
},
"harness|arc:challenge|25": {
"acc": 0.5725255972696246,
"acc_stderr": 0.014456862944650649,
"acc_norm": 0.606655290102389,
"acc_norm_stderr": 0.014275101465693028
},
"harness|hellaswag|10": {
"acc": 0.6441943835889266,
"acc_stderr": 0.004777782584817781,
"acc_norm": 0.8419637522405895,
"acc_norm_stderr": 0.003640294912838683
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411022,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411022
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5263157894736842,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.5263157894736842,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5207547169811321,
"acc_stderr": 0.030746349975723463,
"acc_norm": 0.5207547169811321,
"acc_norm_stderr": 0.030746349975723463
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4682080924855491,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.4682080924855491,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171451,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171451
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4553191489361702,
"acc_stderr": 0.032555253593403555,
"acc_norm": 0.4553191489361702,
"acc_norm_stderr": 0.032555253593403555
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.025197101074246487,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.025197101074246487
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795133,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795133
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6225806451612903,
"acc_stderr": 0.02757596072327824,
"acc_norm": 0.6225806451612903,
"acc_norm_stderr": 0.02757596072327824
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3891625615763547,
"acc_stderr": 0.03430462416103872,
"acc_norm": 0.3891625615763547,
"acc_norm_stderr": 0.03430462416103872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.03713158067481913,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.03713158067481913
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.03427308652999934,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.03427308652999934
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7098445595854922,
"acc_stderr": 0.03275264467791516,
"acc_norm": 0.7098445595854922,
"acc_norm_stderr": 0.03275264467791516
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4794871794871795,
"acc_stderr": 0.025329663163489943,
"acc_norm": 0.4794871794871795,
"acc_norm_stderr": 0.025329663163489943
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.46218487394957986,
"acc_stderr": 0.032385469487589795,
"acc_norm": 0.46218487394957986,
"acc_norm_stderr": 0.032385469487589795
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6458715596330276,
"acc_stderr": 0.020504729013829114,
"acc_norm": 0.6458715596330276,
"acc_norm_stderr": 0.020504729013829114
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.030998666304560524,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.030998666304560524
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.032566854844603886,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.032566854844603886
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7130801687763713,
"acc_stderr": 0.02944377302259469,
"acc_norm": 0.7130801687763713,
"acc_norm_stderr": 0.02944377302259469
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5954198473282443,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.5954198473282443,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04643454608906275,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04643454608906275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.6019417475728155,
"acc_stderr": 0.0484674825397724,
"acc_norm": 0.6019417475728155,
"acc_norm_stderr": 0.0484674825397724
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7606837606837606,
"acc_stderr": 0.027951826808924333,
"acc_norm": 0.7606837606837606,
"acc_norm_stderr": 0.027951826808924333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7650063856960408,
"acc_stderr": 0.015162024152278434,
"acc_norm": 0.7650063856960408,
"acc_norm_stderr": 0.015162024152278434
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.026226158605124655,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.026226158605124655
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2916201117318436,
"acc_stderr": 0.01520103251252044,
"acc_norm": 0.2916201117318436,
"acc_norm_stderr": 0.01520103251252044
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5849673202614379,
"acc_stderr": 0.0282135041778241,
"acc_norm": 0.5849673202614379,
"acc_norm_stderr": 0.0282135041778241
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.572347266881029,
"acc_stderr": 0.02809924077580955,
"acc_norm": 0.572347266881029,
"acc_norm_stderr": 0.02809924077580955
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.026725868809100793,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.026725868809100793
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.02949482760014437,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.02949482760014437
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39960886571056065,
"acc_stderr": 0.01251018163696068,
"acc_norm": 0.39960886571056065,
"acc_norm_stderr": 0.01251018163696068
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4227941176470588,
"acc_stderr": 0.030008562845003483,
"acc_norm": 0.4227941176470588,
"acc_norm_stderr": 0.030008562845003483
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5849673202614379,
"acc_stderr": 0.01993362777685742,
"acc_norm": 0.5849673202614379,
"acc_norm_stderr": 0.01993362777685742
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.04653429807913508,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.04653429807913508
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4816326530612245,
"acc_stderr": 0.031987615467631264,
"acc_norm": 0.4816326530612245,
"acc_norm_stderr": 0.031987615467631264
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6368159203980099,
"acc_stderr": 0.034005985055990146,
"acc_norm": 0.6368159203980099,
"acc_norm_stderr": 0.034005985055990146
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7192982456140351,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.7192982456140351,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35006119951040393,
"mc1_stderr": 0.01669794942015103,
"mc2": 0.5134609475665187,
"mc2_stderr": 0.014908191115467387
},
"harness|winogrande|5": {
"acc": 0.829518547750592,
"acc_stderr": 0.01056902112282591
},
"harness|gsm8k|5": {
"acc": 0.34268385140257773,
"acc_stderr": 0.01307303023082791
}
}
```
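For illustration, the top-level `"all"` accuracy is (approximately) an unweighted mean over the per-task accuracies; a minimal sketch recomputing such a mean from a hand-copied subset of the entries above (three tasks only, so the value does not match the full aggregate):

```python
# Recompute a mean accuracy from a subset of the per-task results above.
# Only three tasks are copied here, so the value differs from the "all" figure.
subset = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.35},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5703703703703704},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.5263157894736842},
}

mean_acc = sum(task["acc"] for task in subset.values()) / len(subset)
print(f"mean acc over {len(subset)} tasks: {mean_acc:.4f}")
```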
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Nexdata/Burmese_Spontaneous_Speech_Data | ---
task_categories:
- automatic-speech-recognition
language:
- my
---
# Dataset Card for Nexdata/Burmese_Spontaneous_Speech_Data
## Description
The 212 Hours - Burmese Spontaneous Speech Data is a collection of speech clips covering multiple topics. All the speech audio was manually transcribed into text; speaker identity, gender, and other attributes are also annotated. This dataset can be used for voiceprint recognition model training, corpus construction for machine translation, and algorithm research.
For more details, please refer to the link: https://www.nexdata.ai/datasets/1272?source=Huggingface
# Specifications
## Format
16kHz, 16bit, mono channel;
## Content category
including service, conversation, interview, etc.
## Language
Burmese;
## Annotation
annotations for transcription text, speaker identification, and gender;
## Application scenarios
speech recognition, video caption generation and video content review;
## Accuracy
Word Accuracy Rate (WAR) of no less than 98%.
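The audio format stated above can be verified programmatically; a minimal stdlib-only sketch that builds a synthetic clip in memory (not an actual dataset file) and checks its parameters:

```python
import io
import wave

# Build a short 16 kHz, 16-bit, mono WAV in memory (1000 samples of silence)
# and verify it matches the format this dataset advertises.
buf = io.BytesIO()
with wave.open(buf, "wb") as w:
    w.setnchannels(1)        # mono
    w.setsampwidth(2)        # 16-bit = 2 bytes per sample
    w.setframerate(16000)    # 16 kHz
    w.writeframes(b"\x00\x00" * 1000)

buf.seek(0)
with wave.open(buf, "rb") as w:
    params = w.getparams()

ok = (params.nchannels == 1 and params.sampwidth == 2
      and params.framerate == 16000)
print("matches 16 kHz / 16-bit / mono:", ok)
```

The same check can be applied to any downloaded clip by passing its path to `wave.open` instead of the in-memory buffer.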
# Licensing Information
Commercial License |
RenatoBC/markfinley2 | ---
license: openrail
---
|
Craque/voz_Ze | ---
license: openrail
---
|
wenqiglantz/guanaco-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966694
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
This is a subset (1000 samples) of the [`timdettmers/openassistant-guanaco`](https://huggingface.co/datasets/timdettmers/openassistant-guanaco) dataset, processed to match Llama 2's prompt format as described [in this article](https://huggingface.co/blog/llama2#how-to-prompt-llama-2). It was created using this [colab notebook](https://colab.research.google.com/drive/1afeicfJa9Mo8-wEcDoGrjyoVLyFkF9xm?usp=sharing).
Inspired by Maxime Labonne's [llm-course repo](https://github.com/mlabonne/llm-course).
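For reference, the Llama 2 prompt format referenced above wraps each instruction in `[INST] … [/INST]` markers inside `<s> … </s>`; a minimal sketch of the per-sample transformation (spacing follows the linked blog post, but verify against the dataset itself):

```python
def to_llama2_prompt(instruction: str, response: str) -> str:
    """Format one instruction/response pair in Llama 2's chat template."""
    return f"<s>[INST] {instruction} [/INST] {response} </s>"

sample = to_llama2_prompt("What is a llama?", "A South American camelid.")
print(sample)
```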
PRACADACERA/Dragon | ---
license: openrail
---
|
Trollator/mcigu | ---
license: openrail
---
|
fahuamancaja/file_contents | ---
language:
- en
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 227417
num_examples: 85
download_size: 33719
dataset_size: 227417
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sleeping4cat/8chan | ---
license: bigscience-openrail-m
language:
- en
pretty_name: scarlet-dark
---
#### Overview
The 8chan initiative emanates from the Sleeping AI Lab, aiming to furnish the research community with a superlative dataset in unconventional domains. Aligned with Sleeping AI's Datasets initiative, our commitment is directed toward furnishing premium Open Source datasets, with 8chan standing out as a singular endeavour that delves into a segment of the Internet colloquially known as the Dark Web, often deemed taboo in mainstream media.
Our contribution encompasses media and image data extracted from the 8chan image board, serving as the enigmatic counterpart to the 4chan platform. This pioneering dataset distinguishes itself within the research community, being the inaugural compilation exclusively dedicated to 8chan. We trust that the community will exercise due responsibility in its utilisation.
#### Technical Aspects of the Dataset
Within this dataset, designated as the **kun** variant, individuals may discover unaltered media and images sourced from the entirety of the 8chan platform, encompassing all 488 boards and their respective threads up until November 30, 2023. These data were downloaded and stored within the overarching super-folder 'kun,' with each board's collected data residing in its own folder identified by the convention "board = foldername."
Metadata pertaining to the uploading user accompanies the media, inclusive of sensitive information. Accordingly, we implore researchers to employ this information ethically and responsibly. A number of images and media files are corrupted, but they were retained, as careful examination may still extract value from them.
For practical engagement with the images, a set of image embeddings computed from the scraped data has been published on Kaggle, fostering the development of robust and innovative models.
Subsequently, we introduced a refined variant, **clear_kun**, representing a preprocessed and sanitised iteration of the dataset. This version, comprising 3882 images, has undergone meticulous filtering to eliminate corruptions and serves as the foundation for generating embeddings.
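Embeddings such as those in the Kaggle subset are typically consumed via nearest-neighbour search; a self-contained sketch with toy vectors (pure Python — the ids and dimensionality here are illustrative, not drawn from the actual files):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy 4-dimensional "embeddings" keyed by hypothetical image ids.
embeddings = {
    "img_0001": [0.9, 0.1, 0.0, 0.1],
    "img_0002": [0.0, 1.0, 0.1, 0.0],
    "img_0003": [0.8, 0.2, 0.1, 0.0],
}

query = [1.0, 0.0, 0.0, 0.0]
best = max(embeddings, key=lambda k: cosine(query, embeddings[k]))
print("nearest image:", best)
```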
#### Liability
It is imperative to clarify that any potential misuse by third parties absolves the undersigned of responsibility. We uphold a stringent request policy, necessitating interested parties to submit requests for dataset access, which will be individually reviewed. Researchers are strongly encouraged to uphold privacy and adhere to ethical guidelines, with any inadvertent misuse falling outside the purview of responsibility.
The release of this dataset is expressly intended for academic and research purposes; its content should only be viewed by individuals aged 20 or older.
For inquiries or concerns, please direct correspondence to *sleeping4cat@outlook.com.*
Kaggle (Image Embedding): https://www.kaggle.com/datasets/sleepingcat4/8chan-image-embeddings/data |
malucoelhaofc/MackeyV2 | ---
license: openrail
---
|
dsupa/hack5-IQ-HP-FFT | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
'2': '2'
'3': '3'
'4': '4'
'5': '5'
'6': '6'
splits:
- name: train
num_bytes: 3876445.0
num_examples: 647
download_size: 3833722
dataset_size: 3876445.0
---
# Dataset Card for "hack5-IQ-HP-FFT"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
PedroDKE/LibriS2S | ---
annotations_creators: []
language:
- en
- de
language_creators: []
license:
- cc-by-nc-sa-4.0
multilinguality:
- multilingual
pretty_name: LibriS2S German-English Speech and Text pairs
size_categories:
- 10K<n<100K
source_datasets: []
tags:
- LibriS2S
- LibrivoxDeEn
- Speech-to-Speech translation
- LREC2022
task_categories:
- text-to-speech
- automatic-speech-recognition
- translation
task_ids: []
---
# LibriS2S
This repo contains scripts and alignment data to create a dataset built on top of [librivoxDeEn](https://www.cl.uni-heidelberg.de/statnlpgroup/librivoxdeen/) such that it contains (German audio, German transcription, English audio, English transcription) quadruplets and can be used for speech-to-speech translation research. Because of this, the alignments are released under the same [Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License](https://creativecommons.org/licenses/by-nc-sa/4.0/).
These alignments were collected by downloading the English audiobooks and using [aeneas](https://github.com/readbeyond/aeneas) to align the book chapters to the transcripts. For more information read the original [paper](https://arxiv.org/abs/2204.10593) (Presented at LREC 2022)
### The data
The English/German audio are available in the folder EN/DE respectively and can be downloaded from [this onedrive](https://onedrive.live.com/embed?cid=DCE49ACC2BDA7D8C&resid=DCE49ACC2BDA7D8C%2115663&authkey=ANmUz8gRUoyxmjk). In case there are any problems with the download, feel free to open an issue here or on [GitHub](https://github.com/PedroDKE/LibriS2S). <br/>
The repo structure is as follows:
- Alignments : Contains all the alignments for each book and chapter
- DE : Contains the German audio for each chapter per book.
- EN : Contains the English audio for each chapter per book.
- Example : contains example files on for the scraping and aligning explanations that were used to build this dataset.
- LibrivoxDeEn_alignments : Contains the base alignments from the LibrivoxDeEn dataset. <br/>
In case you feel a part of the data is missing, feel free to open an issue!
The full zipfile is about 52 GB of size.
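LibrivoxDeEn-style alignment tables are tab-separated; a minimal sketch of reading such a table with the stdlib `csv` module (the column names below are illustrative — check the actual tsv header, e.g. in `undine_data.tsv`, before relying on them):

```python
import csv
import io

# An in-memory stand-in for a LibrivoxDeEn-style alignment table.
# Real files may use different column names.
tsv_text = (
    "book\tchapter\tde_sentence\ten_sentence\n"
    "undine\t1\tEs war einmal...\tOnce upon a time...\n"
    "undine\t1\tDer Ritter ritt.\tThe knight rode.\n"
)

rows = list(csv.DictReader(io.StringIO(tsv_text), delimiter="\t"))
pairs = [(r["de_sentence"], r["en_sentence"]) for r in rows]
print(f"{len(pairs)} aligned sentence pairs")
```

For the real dataset, replace the `io.StringIO` buffer with `open("undine_data.tsv", newline="")`.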
### Scraping a book from Librivox
To download all chapters from a librivox url the following command can be used:
```
python scrape_audio_from_librivox.py \
--url https://librivox.org/undine-by-friedrich-de-la-motte-fouque/ \
--save_dir ./examples
```
### Align a book from Librivox with the text from LibrivoxDeEn
To align the previously downloaded book with the txt files and tsv tables provided by LibrivoxDeEn, the following command (based on the example provided with this repo) can be used:
```
python align_text_and_audio.py \
--text_dir ./example/en_text/ \
--audio_path ./example/audio_chapters/ \
--aeneas_path ./example/aeneas/ \
--en_audio_export_path ./example/sentence_level_audio/ \
--total_alignment_path ./example/bi-lingual-alignment/ \
--librivoxdeen_alignment ./example/undine_data.tsv \
--aeneas_head_max 120 \
--aeneas_tail_min 5 \
```
**note:** the example folder in this repo already contains the first two chapters from [Undine](https://librivox.org/undine-by-friedrich-de-la-motte-fouque/) scraped from Librivox, their transcripts, and the tsv table retrieved from LibrivoxDeEn (modified to contain only the first two chapters).
Additional data to align can be scraped using the same script shown previously and combined with the data provided by LibrivoxDeEn.
Additionally, this repo provides the full alignments for the following 8 books, with these LibrivoxDeEn ids:
[9](https://librivox.org/the-picture-of-dorian-gray-1891-version-by-oscar-wilde/), [10](https://librivox.org/pandoras-box-by-frank-wedekind/), [13](https://librivox.org/survivors-of-the-chancellor-by-jules-verne/), [18](https://librivox.org/undine-by-friedrich-de-la-motte-fouque/), [23](https://librivox.org/around-the-world-in-80-days-by-jules-verne/), [108](https://librivox.org/elective-affinities-by-johann-wolfgang-von-goethe/), [110](https://librivox.org/candide-by-voltaire-3/), [120](https://librivox.org/the-metamorphosis-by-franz-kafka/).
Other books such as [11](https://librivox.org/the-castle-of-otranto-by-horace-walpole/), [36](https://librivox.org/the-rider-on-the-white-horse-by-theodor-storm/), [67](https://librivox.org/frankenstein-or-the-modern-prometheus-1818-by-mary-wollstonecraft-shelley/) and [54](https://librivox.org/white-nights-other-stories-by-fyodor-dostoyevsky/) are also inside the LibrivoxDeEn dataset, but their chapters do not correspond in a 1:1 manner (for example: the German version of book 67 has 27 chapters but the English version has 29, and thus needs to be re-aligned before the alignment script in this repo will work). These alignments are therefore provided as well, but they may differ if you scrape the books yourself, as your re-alignment may turn out differently.
### Metrics on the alignment given in this repo.
Using the alignments given in this repo, some metrics were collected and are displayed here. For this table and the next figure, the books that were manually aligned, although provided in the zip, were not accounted for; the full table can be found in the original paper.
| | German | English |
| :---: | :-: | :-: |
|number of files | 18868 | 18868 |
|total time (hh:mm:ss) | 39:11:08 | 40:52:31 |
|Speakers | 41 |22 |
Note: the speakers were counted for each book separately, so some speakers may be counted more than once.
The number of hours for each book aligned in this repo:<br>
<img src="https://user-images.githubusercontent.com/43861296/122250648-1f5f7f80-ceca-11eb-84fd-344a2261bf47.png" width="500">
When using this work, please cite the original paper and the LibrivoxDeEn authors:
```
@inproceedings{jeuris-niehues-2022-libris2s,
title = "{L}ibri{S}2{S}: A {G}erman-{E}nglish Speech-to-Speech Translation Corpus",
author = "Jeuris, Pedro and
Niehues, Jan",
booktitle = "Proceedings of the Thirteenth Language Resources and Evaluation Conference",
month = jun,
year = "2022",
address = "Marseille, France",
publisher = "European Language Resources Association",
url = "https://aclanthology.org/2022.lrec-1.98",
pages = "928--935",
abstract = "Recently, we have seen an increasing interest in the area of speech-to-text translation. This has led to astonishing improvements in this area. In contrast, the activities in the area of speech-to-speech translation is still limited, although it is essential to overcome the language barrier. We believe that one of the limiting factors is the availability of appropriate training data. We address this issue by creating LibriS2S, to our knowledge the first publicly available speech-to-speech training corpus between German and English. For this corpus, we used independently created audio for German and English leading to an unbiased pronunciation of the text in both languages. This allows the creation of a new text-to-speech and speech-to-speech translation model that directly learns to generate the speech signal based on the pronunciation of the source language. Using this created corpus, we propose Text-to-Speech models based on the example of the recently proposed FastSpeech 2 model that integrates source language information. We do this by adapting the model to take information such as the pitch, energy or transcript from the source speech as additional input.",
}
```
```
@article{beilharz19,
title = {LibriVoxDeEn: A Corpus for German-to-English Speech Translation and Speech Recognition},
author = {Beilharz, Benjamin and Sun, Xin and Karimova, Sariya and Riezler, Stefan},
journal = {Proceedings of the Language Resources and Evaluation Conference},
journal-abbrev = {LREC},
year = {2020},
city = {Marseille, France},
url = {https://arxiv.org/pdf/1910.07924.pdf}
}
```
|