| datasetId | card |
|---|---|
passionMan/usda_tokenized_source | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 1314644
num_examples: 5628
- name: test
num_bytes: 437798
num_examples: 1876
download_size: 434891
dataset_size: 1752442
---
# Dataset Card for "usda_tokenized_source"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bigfish87/test | ---
license: openrail
---
|
anan-2024/twitter_dataset_1712976471 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 276649
num_examples: 759
download_size: 151493
dataset_size: 276649
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_stsb_invariant_tag_amnt | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 162
num_examples: 1
- name: test
num_bytes: 192
num_examples: 1
download_size: 5842
dataset_size: 354
---
# Dataset Card for "MULTI_VALUE_stsb_invariant_tag_amnt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
RevEng-23-24/Dataset48K | ---
dataset_info:
features:
- name: assembly
dtype: string
- name: c_source_code
dtype: string
splits:
- name: train
num_bytes: 54615105
num_examples: 30657
- name: val
num_bytes: 14102616
num_examples: 7665
- name: test
num_bytes: 17112427
num_examples: 9581
download_size: 24803662
dataset_size: 85830148
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
---
|
ronniewy/chinese_nli | ---
license: cc-by-4.0
---
|
kanxue/muep_cot_checkpoint | ---
license: apache-2.0
---
|
Circularmachines/batch_indexing_machine_224x224_images | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 1200580554.73
num_examples: 72510
download_size: 1200450555
dataset_size: 1200580554.73
---
# Dataset Card for "batch_indexing_machine_224x224_images"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sethapun/imdb_misspelled_30 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': neg
'1': pos
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 33632801
num_examples: 25000
- name: validation
num_bytes: 32851081
num_examples: 25000
download_size: 47443707
dataset_size: 66483882
---
# Dataset Card for "imdb_misspelled_30"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Cynaptics/Test2Sql | ---
license: apache-2.0
dataset_info:
features:
- name: schema
dtype: string
- name: query
dtype: string
- name: question
dtype: string
splits:
- name: small_validation
num_bytes: 1171681
num_examples: 1000
- name: train
num_bytes: 572158109
num_examples: 489956
- name: small_train
num_bytes: 11702337
num_examples: 10000
- name: spider_test
num_bytes: 759534287
num_examples: 4840
- name: sql_eval_test
num_bytes: 18048597
num_examples: 3509
download_size: 150973260
dataset_size: 1362615011
configs:
- config_name: default
data_files:
- split: small_validation
path: data/small_validation-*
- split: train
path: data/train-*
- split: small_train
path: data/small_train-*
- split: spider_test
path: data/spider_test-*
- split: sql_eval_test
path: data/sql_eval_test-*
---
|
Rounak28/bengaliAI-preprocessed-whisper-medium-0-50000 | ---
dataset_info:
features:
- name: id
dtype: string
- name: sentence
dtype: string
- name: split
dtype: string
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 48065858980
num_examples: 50000
download_size: 6861840289
dataset_size: 48065858980
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "bengaliAI-preprocessed-whisper-medium-0-50000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falcon96/cidmoreira | ---
license: openrail
---
|
open-llm-leaderboard/details_feeltheAGI__Maverick-Math-7B | ---
pretty_name: Evaluation run of feeltheAGI/Maverick-Math-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [feeltheAGI/Maverick-Math-7B](https://huggingface.co/feeltheAGI/Maverick-Math-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_feeltheAGI__Maverick-Math-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-14T10:55:45.107983](https://huggingface.co/datasets/open-llm-leaderboard/details_feeltheAGI__Maverick-Math-7B/blob/main/results_2024-03-14T10-55-45.107983.json) (note
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6296804564180761,\n\
\ \"acc_stderr\": 0.03245152465151836,\n \"acc_norm\": 0.6301031619229365,\n\
\ \"acc_norm_stderr\": 0.033115363797357654,\n \"mc1\": 0.39167686658506734,\n\
\ \"mc1_stderr\": 0.017087795881769625,\n \"mc2\": 0.5596682871042189,\n\
\ \"mc2_stderr\": 0.015333014253402569\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6220136518771331,\n \"acc_stderr\": 0.0141696645203031,\n\
\ \"acc_norm\": 0.6527303754266212,\n \"acc_norm_stderr\": 0.013913034529620455\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6515634335789683,\n\
\ \"acc_stderr\": 0.004755013243022125,\n \"acc_norm\": 0.8454491137223661,\n\
\ \"acc_norm_stderr\": 0.0036073726062950976\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n\
\ \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n\
\ \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.03745554791462456,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.03745554791462456\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.025487187147859375,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.025487187147859375\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7483870967741936,\n\
\ \"acc_stderr\": 0.024685979286239963,\n \"acc_norm\": 0.7483870967741936,\n\
\ \"acc_norm_stderr\": 0.024685979286239963\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483016,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483016\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328972,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328972\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6410256410256411,\n \"acc_stderr\": 0.024321738484602357,\n\
\ \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.024321738484602357\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.02794045713622841,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.02794045713622841\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.03077805742293167,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.03077805742293167\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8293577981651377,\n \"acc_stderr\": 0.016129271025099864,\n \"\
acc_norm\": 0.8293577981651377,\n \"acc_norm_stderr\": 0.016129271025099864\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8088235294117647,\n\
\ \"acc_stderr\": 0.027599174300640763,\n \"acc_norm\": 0.8088235294117647,\n\
\ \"acc_norm_stderr\": 0.027599174300640763\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n\
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728744,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728744\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615624,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615624\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.043546310772605956,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.043546310772605956\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281382,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281382\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368985,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368985\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.02494679222527231,\n\
\ \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.02494679222527231\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35083798882681566,\n\
\ \"acc_stderr\": 0.01596103667523096,\n \"acc_norm\": 0.35083798882681566,\n\
\ \"acc_norm_stderr\": 0.01596103667523096\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.026493033225145898,\n\
\ \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.026493033225145898\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.025494259350694912,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.025494259350694912\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.02540719779889016,\n\
\ \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.02540719779889016\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4602346805736636,\n\
\ \"acc_stderr\": 0.012729785386598559,\n \"acc_norm\": 0.4602346805736636,\n\
\ \"acc_norm_stderr\": 0.012729785386598559\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6433823529411765,\n \"acc_stderr\": 0.029097209568411952,\n\
\ \"acc_norm\": 0.6433823529411765,\n \"acc_norm_stderr\": 0.029097209568411952\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6552287581699346,\n \"acc_stderr\": 0.019228322018696644,\n \
\ \"acc_norm\": 0.6552287581699346,\n \"acc_norm_stderr\": 0.019228322018696644\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.0293936093198798,\n\
\ \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.0293936093198798\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.02619392354445412,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.02619392354445412\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03126781714663179,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03126781714663179\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.39167686658506734,\n\
\ \"mc1_stderr\": 0.017087795881769625,\n \"mc2\": 0.5596682871042189,\n\
\ \"mc2_stderr\": 0.015333014253402569\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7971586424625099,\n \"acc_stderr\": 0.011301439925936645\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6618650492797574,\n \
\ \"acc_stderr\": 0.013030829145172208\n }\n}\n```"
repo_url: https://huggingface.co/feeltheAGI/Maverick-Math-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|arc:challenge|25_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|gsm8k|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hellaswag|10_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T10-55-45.107983.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-14T10-55-45.107983.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- '**/details_harness|winogrande|5_2024-03-14T10-55-45.107983.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-14T10-55-45.107983.parquet'
- config_name: results
data_files:
- split: 2024_03_14T10_55_45.107983
path:
- results_2024-03-14T10-55-45.107983.parquet
- split: latest
path:
- results_2024-03-14T10-55-45.107983.parquet
---
# Dataset Card for Evaluation run of feeltheAGI/Maverick-Math-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [feeltheAGI/Maverick-Math-7B](https://huggingface.co/feeltheAGI/Maverick-Math-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_feeltheAGI__Maverick-Math-7B",
"harness_winogrande_5",
split="train")
```
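The repository name used above follows a simple convention visible in this card: the evaluated model's `org/name` identifier is joined with `__` and prefixed with `details_` under the `open-llm-leaderboard` organization. A small illustrative helper (an assumption inferred from the repository name shown here, not an official API):

```python
def details_repo(model_id: str) -> str:
    """Map a Hub model id ("org/name") to its leaderboard details repo.

    Assumes the naming convention seen in this card:
    "open-llm-leaderboard/details_<org>__<name>".
    """
    org, name = model_id.split("/")
    return f"open-llm-leaderboard/details_{org}__{name}"


print(details_repo("feeltheAGI/Maverick-Math-7B"))
# i.e. the repository this card documents
```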
## Latest results
These are the [latest results from run 2024-03-14T10:55:45.107983](https://huggingface.co/datasets/open-llm-leaderboard/details_feeltheAGI__Maverick-Math-7B/blob/main/results_2024-03-14T10-55-45.107983.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each task's results under the "results" configuration and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.6296804564180761,
"acc_stderr": 0.03245152465151836,
"acc_norm": 0.6301031619229365,
"acc_norm_stderr": 0.033115363797357654,
"mc1": 0.39167686658506734,
"mc1_stderr": 0.017087795881769625,
"mc2": 0.5596682871042189,
"mc2_stderr": 0.015333014253402569
},
"harness|arc:challenge|25": {
"acc": 0.6220136518771331,
"acc_stderr": 0.0141696645203031,
"acc_norm": 0.6527303754266212,
"acc_norm_stderr": 0.013913034529620455
},
"harness|hellaswag|10": {
"acc": 0.6515634335789683,
"acc_stderr": 0.004755013243022125,
"acc_norm": 0.8454491137223661,
"acc_norm_stderr": 0.0036073726062950976
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03745554791462456,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03745554791462456
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.025487187147859375,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.025487187147859375
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7483870967741936,
"acc_stderr": 0.024685979286239963,
"acc_norm": 0.7483870967741936,
"acc_norm_stderr": 0.024685979286239963
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483016,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483016
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328972,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328972
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6410256410256411,
"acc_stderr": 0.024321738484602357,
"acc_norm": 0.6410256410256411,
"acc_norm_stderr": 0.024321738484602357
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.02794045713622841,
"acc_norm": 0.3,
"acc_norm_stderr": 0.02794045713622841
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.03077805742293167,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.03077805742293167
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8293577981651377,
"acc_stderr": 0.016129271025099864,
"acc_norm": 0.8293577981651377,
"acc_norm_stderr": 0.016129271025099864
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.027599174300640763,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.027599174300640763
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728744,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728744
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615624,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.043546310772605956,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.043546310772605956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281382,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281382
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368985,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368985
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.02494679222527231,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.02494679222527231
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35083798882681566,
"acc_stderr": 0.01596103667523096,
"acc_norm": 0.35083798882681566,
"acc_norm_stderr": 0.01596103667523096
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.026493033225145898,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.026493033225145898
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.025494259350694912,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.025494259350694912
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.02540719779889016,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.02540719779889016
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4602346805736636,
"acc_stderr": 0.012729785386598559,
"acc_norm": 0.4602346805736636,
"acc_norm_stderr": 0.012729785386598559
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6433823529411765,
"acc_stderr": 0.029097209568411952,
"acc_norm": 0.6433823529411765,
"acc_norm_stderr": 0.029097209568411952
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6552287581699346,
"acc_stderr": 0.019228322018696644,
"acc_norm": 0.6552287581699346,
"acc_norm_stderr": 0.019228322018696644
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.0293936093198798,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.0293936093198798
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.02619392354445412,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.02619392354445412
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03126781714663179,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03126781714663179
},
"harness|truthfulqa:mc|0": {
"mc1": 0.39167686658506734,
"mc1_stderr": 0.017087795881769625,
"mc2": 0.5596682871042189,
"mc2_stderr": 0.015333014253402569
},
"harness|winogrande|5": {
"acc": 0.7971586424625099,
"acc_stderr": 0.011301439925936645
},
"harness|gsm8k|5": {
"acc": 0.6618650492797574,
"acc_stderr": 0.013030829145172208
}
}
```
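As a rough illustration (not part of the official leaderboard pipeline, which applies its own aggregation), the per-task `acc` values above can be averaged to get an MMLU-style aggregate. The sketch below copies a few `harness|hendrycksTest-*` scores verbatim from the results; in practice you would iterate over the full results file instead:

```python
# Sample of per-task accuracies, copied from the results JSON above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.28},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.562962962962963},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7105263157894737},
}

# Keep only the MMLU (hendrycksTest) tasks and average their accuracies.
mmlu_scores = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_scores) / len(mmlu_scores)
print(f"MMLU average over {len(mmlu_scores)} tasks: {mmlu_avg:.4f}")
```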
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
SkyWR/Pespi | ---
license: openrail
---
|
open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-200step-merged | ---
pretty_name: Evaluation run of Korabbit/Llama-2-7b-chat-hf-afr-200step-merged
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Korabbit/Llama-2-7b-chat-hf-afr-200step-merged](https://huggingface.co/Korabbit/Llama-2-7b-chat-hf-afr-200step-merged)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 1 configuration, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-200step-merged\"\
,\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese\
\ are the [latest results from run 2023-12-02T13:52:27.757521](https://huggingface.co/datasets/open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-200step-merged/blob/main/results_2023-12-02T13-52-27.757521.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.18953752843062927,\n\
\ \"acc_stderr\": 0.010795837931896377\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.18953752843062927,\n \"acc_stderr\": 0.010795837931896377\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Korabbit/Llama-2-7b-chat-hf-afr-200step-merged
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_02T13_52_27.757521
path:
- '**/details_harness|gsm8k|5_2023-12-02T13-52-27.757521.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-02T13-52-27.757521.parquet'
- config_name: results
data_files:
- split: 2023_12_02T13_52_27.757521
path:
- results_2023-12-02T13-52-27.757521.parquet
- split: latest
path:
- results_2023-12-02T13-52-27.757521.parquet
---
# Dataset Card for Evaluation run of Korabbit/Llama-2-7b-chat-hf-afr-200step-merged
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Korabbit/Llama-2-7b-chat-hf-afr-200step-merged
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Korabbit/Llama-2-7b-chat-hf-afr-200step-merged](https://huggingface.co/Korabbit/Llama-2-7b-chat-hf-afr-200step-merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 1 configuration, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-200step-merged",
"harness_gsm8k_5",
split="train")
```
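The timestamped splits mentioned above can also be addressed directly. A small sketch of how a run timestamp seems to map to its split name (an assumption inferred from the `configs` listing in this card's metadata, not a documented rule):

```python
# The timestamped split names appear to be the run timestamp with "-" and
# ":" replaced by "_" -- inferred from this card's metadata, where run
# 2023-12-02T13:52:27.757521 corresponds to the split computed below.
timestamp = "2023-12-02T13:52:27.757521"
split_name = timestamp.replace("-", "_").replace(":", "_")
# split_name == "2023_12_02T13_52_27.757521"
```

The resulting name can then be passed as `split=` in place of `"train"` or `"latest"`.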
## Latest results
These are the [latest results from run 2023-12-02T13:52:27.757521](https://huggingface.co/datasets/open-llm-leaderboard/details_Korabbit__Llama-2-7b-chat-hf-afr-200step-merged/blob/main/results_2023-12-02T13-52-27.757521.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.18953752843062927,
"acc_stderr": 0.010795837931896377
},
"harness|gsm8k|5": {
"acc": 0.18953752843062927,
"acc_stderr": 0.010795837931896377
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
BangumiBase/myheroacademia | ---
license: mit
tags:
- art
size_categories:
- 10K<n<100K
---
# Bangumi Image Base of My Hero Academia
This is the image base of bangumi My Hero Academia. We detected 146 characters and 15676 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may actually be noisy.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potentially noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:----------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|
| 0 | 186 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 254 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 105 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 546 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 69 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 739 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 201 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 34 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 129 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 31 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 3091 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 71 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 174 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 211 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 878 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 130 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 75 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 63 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 69 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 540 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 274 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 286 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 140 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 25 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 81 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 30 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 60 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 26 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 41 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 52 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 81 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 26 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 86 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 65 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 32 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 41 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 74 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 259 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 26 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 11 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 26 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 78 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 29 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 24 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 41 | [Download](44/dataset.zip) |  |  |  |  |  |  |  |  |
| 45 | 24 | [Download](45/dataset.zip) |  |  |  |  |  |  |  |  |
| 46 | 450 | [Download](46/dataset.zip) |  |  |  |  |  |  |  |  |
| 47 | 21 | [Download](47/dataset.zip) |  |  |  |  |  |  |  |  |
| 48 | 43 | [Download](48/dataset.zip) |  |  |  |  |  |  |  |  |
| 49 | 75 | [Download](49/dataset.zip) |  |  |  |  |  |  |  |  |
| 50 | 85 | [Download](50/dataset.zip) |  |  |  |  |  |  |  |  |
| 51 | 85 | [Download](51/dataset.zip) |  |  |  |  |  |  |  |  |
| 52 | 41 | [Download](52/dataset.zip) |  |  |  |  |  |  |  |  |
| 53 | 20 | [Download](53/dataset.zip) |  |  |  |  |  |  |  |  |
| 54 | 79 | [Download](54/dataset.zip) |  |  |  |  |  |  |  |  |
| 55 | 20 | [Download](55/dataset.zip) |  |  |  |  |  |  |  |  |
| 56 | 117 | [Download](56/dataset.zip) |  |  |  |  |  |  |  |  |
| 57 | 21 | [Download](57/dataset.zip) |  |  |  |  |  |  |  |  |
| 58 | 159 | [Download](58/dataset.zip) |  |  |  |  |  |  |  |  |
| 59 | 41 | [Download](59/dataset.zip) |  |  |  |  |  |  |  |  |
| 60 | 14 | [Download](60/dataset.zip) |  |  |  |  |  |  |  |  |
| 61 | 32 | [Download](61/dataset.zip) |  |  |  |  |  |  |  |  |
| 62 | 29 | [Download](62/dataset.zip) |  |  |  |  |  |  |  |  |
| 63 | 388 | [Download](63/dataset.zip) |  |  |  |  |  |  |  |  |
| 64 | 42 | [Download](64/dataset.zip) |  |  |  |  |  |  |  |  |
| 65 | 124 | [Download](65/dataset.zip) |  |  |  |  |  |  |  |  |
| 66 | 24 | [Download](66/dataset.zip) |  |  |  |  |  |  |  |  |
| 67 | 25 | [Download](67/dataset.zip) |  |  |  |  |  |  |  |  |
| 68 | 20 | [Download](68/dataset.zip) |  |  |  |  |  |  |  |  |
| 69 | 45 | [Download](69/dataset.zip) |  |  |  |  |  |  |  |  |
| 70 | 20 | [Download](70/dataset.zip) |  |  |  |  |  |  |  |  |
| 71 | 21 | [Download](71/dataset.zip) |  |  |  |  |  |  |  |  |
| 72 | 31 | [Download](72/dataset.zip) |  |  |  |  |  |  |  |  |
| 73 | 26 | [Download](73/dataset.zip) |  |  |  |  |  |  |  |  |
| 74 | 14 | [Download](74/dataset.zip) |  |  |  |  |  |  |  |  |
| 75 | 33 | [Download](75/dataset.zip) |  |  |  |  |  |  |  |  |
| 76 | 21 | [Download](76/dataset.zip) |  |  |  |  |  |  |  |  |
| 77 | 9 | [Download](77/dataset.zip) |  |  |  |  |  |  |  |  |
| 78 | 17 | [Download](78/dataset.zip) |  |  |  |  |  |  |  |  |
| 79 | 32 | [Download](79/dataset.zip) |  |  |  |  |  |  |  |  |
| 80 | 24 | [Download](80/dataset.zip) |  |  |  |  |  |  |  |  |
| 81 | 26 | [Download](81/dataset.zip) |  |  |  |  |  |  |  |  |
| 82 | 21 | [Download](82/dataset.zip) |  |  |  |  |  |  |  |  |
| 83 | 18 | [Download](83/dataset.zip) |  |  |  |  |  |  |  |  |
| 84 | 331 | [Download](84/dataset.zip) |  |  |  |  |  |  |  |  |
| 85 | 12 | [Download](85/dataset.zip) |  |  |  |  |  |  |  |  |
| 86 | 20 | [Download](86/dataset.zip) |  |  |  |  |  |  |  |  |
| 87 | 18 | [Download](87/dataset.zip) |  |  |  |  |  |  |  |  |
| 88 | 101 | [Download](88/dataset.zip) |  |  |  |  |  |  |  |  |
| 89 | 813 | [Download](89/dataset.zip) |  |  |  |  |  |  |  |  |
| 90 | 19 | [Download](90/dataset.zip) |  |  |  |  |  |  |  |  |
| 91 | 34 | [Download](91/dataset.zip) |  |  |  |  |  |  |  |  |
| 92 | 13 | [Download](92/dataset.zip) |  |  |  |  |  |  |  |  |
| 93 | 11 | [Download](93/dataset.zip) |  |  |  |  |  |  |  |  |
| 94 | 13 | [Download](94/dataset.zip) |  |  |  |  |  |  |  |  |
| 95 | 11 | [Download](95/dataset.zip) |  |  |  |  |  |  |  |  |
| 96 | 33 | [Download](96/dataset.zip) |  |  |  |  |  |  |  |  |
| 97 | 17 | [Download](97/dataset.zip) |  |  |  |  |  |  |  |  |
| 98 | 21 | [Download](98/dataset.zip) |  |  |  |  |  |  |  |  |
| 99 | 25 | [Download](99/dataset.zip) |  |  |  |  |  |  |  |  |
| 100 | 25 | [Download](100/dataset.zip) |  |  |  |  |  |  |  |  |
| 101 | 16 | [Download](101/dataset.zip) |  |  |  |  |  |  |  |  |
| 102 | 34 | [Download](102/dataset.zip) |  |  |  |  |  |  |  |  |
| 103 | 52 | [Download](103/dataset.zip) |  |  |  |  |  |  |  |  |
| 104 | 199 | [Download](104/dataset.zip) |  |  |  |  |  |  |  |  |
| 105 | 34 | [Download](105/dataset.zip) |  |  |  |  |  |  |  |  |
| 106 | 36 | [Download](106/dataset.zip) |  |  |  |  |  |  |  |  |
| 107 | 11 | [Download](107/dataset.zip) |  |  |  |  |  |  |  |  |
| 108 | 31 | [Download](108/dataset.zip) |  |  |  |  |  |  |  |  |
| 109 | 19 | [Download](109/dataset.zip) |  |  |  |  |  |  |  |  |
| 110 | 22 | [Download](110/dataset.zip) |  |  |  |  |  |  |  |  |
| 111 | 7 | [Download](111/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 112 | 67 | [Download](112/dataset.zip) |  |  |  |  |  |  |  |  |
| 113 | 23 | [Download](113/dataset.zip) |  |  |  |  |  |  |  |  |
| 114 | 390 | [Download](114/dataset.zip) |  |  |  |  |  |  |  |  |
| 115 | 18 | [Download](115/dataset.zip) |  |  |  |  |  |  |  |  |
| 116 | 10 | [Download](116/dataset.zip) |  |  |  |  |  |  |  |  |
| 117 | 31 | [Download](117/dataset.zip) |  |  |  |  |  |  |  |  |
| 118 | 46 | [Download](118/dataset.zip) |  |  |  |  |  |  |  |  |
| 119 | 16 | [Download](119/dataset.zip) |  |  |  |  |  |  |  |  |
| 120 | 32 | [Download](120/dataset.zip) |  |  |  |  |  |  |  |  |
| 121 | 10 | [Download](121/dataset.zip) |  |  |  |  |  |  |  |  |
| 122 | 126 | [Download](122/dataset.zip) |  |  |  |  |  |  |  |  |
| 123 | 48 | [Download](123/dataset.zip) |  |  |  |  |  |  |  |  |
| 124 | 21 | [Download](124/dataset.zip) |  |  |  |  |  |  |  |  |
| 125 | 10 | [Download](125/dataset.zip) |  |  |  |  |  |  |  |  |
| 126 | 52 | [Download](126/dataset.zip) |  |  |  |  |  |  |  |  |
| 127 | 42 | [Download](127/dataset.zip) |  |  |  |  |  |  |  |  |
| 128 | 32 | [Download](128/dataset.zip) |  |  |  |  |  |  |  |  |
| 129 | 28 | [Download](129/dataset.zip) |  |  |  |  |  |  |  |  |
| 130 | 13 | [Download](130/dataset.zip) |  |  |  |  |  |  |  |  |
| 131 | 9 | [Download](131/dataset.zip) |  |  |  |  |  |  |  |  |
| 132 | 57 | [Download](132/dataset.zip) |  |  |  |  |  |  |  |  |
| 133 | 32 | [Download](133/dataset.zip) |  |  |  |  |  |  |  |  |
| 134 | 14 | [Download](134/dataset.zip) |  |  |  |  |  |  |  |  |
| 135 | 138 | [Download](135/dataset.zip) |  |  |  |  |  |  |  |  |
| 136 | 14 | [Download](136/dataset.zip) |  |  |  |  |  |  |  |  |
| 137 | 11 | [Download](137/dataset.zip) |  |  |  |  |  |  |  |  |
| 138 | 6 | [Download](138/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 139 | 33 | [Download](139/dataset.zip) |  |  |  |  |  |  |  |  |
| 140 | 7 | [Download](140/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 141 | 8 | [Download](141/dataset.zip) |  |  |  |  |  |  |  |  |
| 142 | 8 | [Download](142/dataset.zip) |  |  |  |  |  |  |  |  |
| 143 | 15 | [Download](143/dataset.zip) |  |  |  |  |  |  |  |  |
| 144 | 17 | [Download](144/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 467 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
open-llm-leaderboard/details_cognitivecomputations__dolphin-2.8-experiment26-7b-preview | ---
pretty_name: Evaluation run of cognitivecomputations/dolphin-2.8-experiment26-7b-preview
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [cognitivecomputations/dolphin-2.8-experiment26-7b-preview](https://huggingface.co/cognitivecomputations/dolphin-2.8-experiment26-7b-preview)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_cognitivecomputations__dolphin-2.8-experiment26-7b-preview\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-03T20:15:12.993951](https://huggingface.co/datasets/open-llm-leaderboard/details_cognitivecomputations__dolphin-2.8-experiment26-7b-preview/blob/main/results_2024-03-03T20-15-12.993951.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6353244379094956,\n\
\ \"acc_stderr\": 0.032491657533462066,\n \"acc_norm\": 0.6360998788498634,\n\
\ \"acc_norm_stderr\": 0.033161036274808056,\n \"mc1\": 0.3769889840881273,\n\
\ \"mc1_stderr\": 0.016965517578930354,\n \"mc2\": 0.5487265863315941,\n\
\ \"mc2_stderr\": 0.015174281776839011\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6143344709897611,\n \"acc_stderr\": 0.014224250973257186,\n\
\ \"acc_norm\": 0.6450511945392492,\n \"acc_norm_stderr\": 0.013983036904094094\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6413065126468831,\n\
\ \"acc_stderr\": 0.004786368011500458,\n \"acc_norm\": 0.8378809002190799,\n\
\ \"acc_norm_stderr\": 0.00367806799442448\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055266,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055266\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7419354838709677,\n \"acc_stderr\": 0.02489246917246283,\n \"\
acc_norm\": 0.7419354838709677,\n \"acc_norm_stderr\": 0.02489246917246283\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n \"\
acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768763\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \
\ \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131143,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131143\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566545,\n\
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566545\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461763,\n \"\
acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461763\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553346,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553346\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.02574490253229092,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.02574490253229092\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.031024411740572213,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.031024411740572213\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163046,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163046\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8122605363984674,\n\
\ \"acc_stderr\": 0.013964393769899133,\n \"acc_norm\": 0.8122605363984674,\n\
\ \"acc_norm_stderr\": 0.013964393769899133\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323378,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323378\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2916201117318436,\n\
\ \"acc_stderr\": 0.015201032512520437,\n \"acc_norm\": 0.2916201117318436,\n\
\ \"acc_norm_stderr\": 0.015201032512520437\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137904,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137904\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.026236965881153266,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.026236965881153266\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.02517104191530968,\n\
\ \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.02517104191530968\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n\
\ \"acc_stderr\": 0.012741974333897224,\n \"acc_norm\": 0.4667535853976532,\n\
\ \"acc_norm_stderr\": 0.012741974333897224\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6433823529411765,\n \"acc_stderr\": 0.02909720956841195,\n\
\ \"acc_norm\": 0.6433823529411765,\n \"acc_norm_stderr\": 0.02909720956841195\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6617647058823529,\n \"acc_stderr\": 0.019139943748487036,\n \
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.019139943748487036\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8009950248756219,\n\
\ \"acc_stderr\": 0.028231365092758406,\n \"acc_norm\": 0.8009950248756219,\n\
\ \"acc_norm_stderr\": 0.028231365092758406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.037752516806863715,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.037752516806863715\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368036,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368036\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3769889840881273,\n\
\ \"mc1_stderr\": 0.016965517578930354,\n \"mc2\": 0.5487265863315941,\n\
\ \"mc2_stderr\": 0.015174281776839011\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8161010260457774,\n \"acc_stderr\": 0.010887916013305892\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6360879454131918,\n \
\ \"acc_stderr\": 0.013252539227966195\n }\n}\n```"
repo_url: https://huggingface.co/cognitivecomputations/dolphin-2.8-experiment26-7b-preview
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|arc:challenge|25_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|gsm8k|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hellaswag|10_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T20-15-12.993951.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-03T20-15-12.993951.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- '**/details_harness|winogrande|5_2024-03-03T20-15-12.993951.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-03T20-15-12.993951.parquet'
- config_name: results
data_files:
- split: 2024_03_03T20_15_12.993951
path:
- results_2024-03-03T20-15-12.993951.parquet
- split: latest
path:
- results_2024-03-03T20-15-12.993951.parquet
---
# Dataset Card for Evaluation run of cognitivecomputations/dolphin-2.8-experiment26-7b-preview
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [cognitivecomputations/dolphin-2.8-experiment26-7b-preview](https://huggingface.co/cognitivecomputations/dolphin-2.8-experiment26-7b-preview) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_cognitivecomputations__dolphin-2.8-experiment26-7b-preview",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-03T20:15:12.993951](https://huggingface.co/datasets/open-llm-leaderboard/details_cognitivecomputations__dolphin-2.8-experiment26-7b-preview/blob/main/results_2024-03-03T20-15-12.993951.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6353244379094956,
"acc_stderr": 0.032491657533462066,
"acc_norm": 0.6360998788498634,
"acc_norm_stderr": 0.033161036274808056,
"mc1": 0.3769889840881273,
"mc1_stderr": 0.016965517578930354,
"mc2": 0.5487265863315941,
"mc2_stderr": 0.015174281776839011
},
"harness|arc:challenge|25": {
"acc": 0.6143344709897611,
"acc_stderr": 0.014224250973257186,
"acc_norm": 0.6450511945392492,
"acc_norm_stderr": 0.013983036904094094
},
"harness|hellaswag|10": {
"acc": 0.6413065126468831,
"acc_stderr": 0.004786368011500458,
"acc_norm": 0.8378809002190799,
"acc_norm_stderr": 0.00367806799442448
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055266,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055266
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7419354838709677,
"acc_stderr": 0.02489246917246283,
"acc_norm": 0.7419354838709677,
"acc_norm_stderr": 0.02489246917246283
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768763,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768763
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131143,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131143
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566545,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566545
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461763,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461763
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553346,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553346
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.02574490253229092,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.02574490253229092
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.031024411740572213,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.031024411740572213
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.039418975265163046,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.039418975265163046
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179333,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8122605363984674,
"acc_stderr": 0.013964393769899133,
"acc_norm": 0.8122605363984674,
"acc_norm_stderr": 0.013964393769899133
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323378,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323378
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2916201117318436,
"acc_stderr": 0.015201032512520437,
"acc_norm": 0.2916201117318436,
"acc_norm_stderr": 0.015201032512520437
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137904,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137904
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153266,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.02517104191530968,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.02517104191530968
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.012741974333897224,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.012741974333897224
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6433823529411765,
"acc_stderr": 0.02909720956841195,
"acc_norm": 0.6433823529411765,
"acc_norm_stderr": 0.02909720956841195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.019139943748487036,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.019139943748487036
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8009950248756219,
"acc_stderr": 0.028231365092758406,
"acc_norm": 0.8009950248756219,
"acc_norm_stderr": 0.028231365092758406
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.037752516806863715,
"acc_norm": 0.83,
"acc_norm_stderr": 0.037752516806863715
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368036,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368036
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3769889840881273,
"mc1_stderr": 0.016965517578930354,
"mc2": 0.5487265863315941,
"mc2_stderr": 0.015174281776839011
},
"harness|winogrande|5": {
"acc": 0.8161010260457774,
"acc_stderr": 0.010887916013305892
},
"harness|gsm8k|5": {
"acc": 0.6360879454131918,
"acc_stderr": 0.013252539227966195
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
vic0428/imdb-card-pred-decimal | ---
dataset_info:
features:
- name: text
dtype: string
- name: prompt
dtype: string
- name: true_cardinality
dtype: int64
splits:
- name: train
num_bytes: 39101954.4
num_examples: 80000
- name: test
num_bytes: 9775488.6
num_examples: 20000
download_size: 8384711
dataset_size: 48877443.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# Dataset Card for "imdb-card-pred-decimal"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-html-90000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 666516
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
runes/3D | ---
license: cc
---
|
SUSTech/mt_bench | ---
dataset_info:
features:
- name: question_id
dtype: int64
- name: category
dtype: string
- name: turns
list:
- name: content
dtype: string
- name: role
dtype: string
- name: reference
sequence: string
splits:
- name: train
num_bytes: 46852
num_examples: 80
download_size: 31246
dataset_size: 46852
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "mt_bench"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sid220/2713-2024-shot-prediction | ---
license: mit
---
|
ovior/twitter_dataset_1713229857 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2288515
num_examples: 7210
download_size: 1295624
dataset_size: 2288515
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MrOvkill/SVIG-v0.1 | ---
license: wtfpl
---
|
open-llm-leaderboard/details_NurtureAI__Orca-2-7B-16k | ---
pretty_name: Evaluation run of NurtureAI/Orca-2-7B-16k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NurtureAI/Orca-2-7B-16k](https://huggingface.co/NurtureAI/Orca-2-7B-16k) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NurtureAI__Orca-2-7B-16k_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-25T21:39:02.599324](https://huggingface.co/datasets/open-llm-leaderboard/details_NurtureAI__Orca-2-7B-16k_public/blob/main/results_2023-11-25T21-39-02.599324.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.36746546712957223,\n\
\ \"acc_stderr\": 0.033751277531008754,\n \"acc_norm\": 0.3738175555586316,\n\
\ \"acc_norm_stderr\": 0.03459812342976094,\n \"mc1\": 0.28886168910648713,\n\
\ \"mc1_stderr\": 0.01586634640138431,\n \"mc2\": 0.45373679597767685,\n\
\ \"mc2_stderr\": 0.015753224924844992,\n \"em\": 0.21046560402684564,\n\
\ \"em_stderr\": 0.004174608410380015,\n \"f1\": 0.267364723154363,\n\
\ \"f1_stderr\": 0.004242093940617827\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4735494880546075,\n \"acc_stderr\": 0.014590931358120174,\n\
\ \"acc_norm\": 0.5059726962457338,\n \"acc_norm_stderr\": 0.014610348300255795\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.47410874327823144,\n\
\ \"acc_stderr\": 0.004983087049281742,\n \"acc_norm\": 0.6389165504879506,\n\
\ \"acc_norm_stderr\": 0.00479333052565621\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4342105263157895,\n \"acc_stderr\": 0.0403356566784832,\n\
\ \"acc_norm\": 0.4342105263157895,\n \"acc_norm_stderr\": 0.0403356566784832\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.41,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.39622641509433965,\n \"acc_stderr\": 0.030102793781791194,\n\
\ \"acc_norm\": 0.39622641509433965,\n \"acc_norm_stderr\": 0.030102793781791194\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3819444444444444,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.3819444444444444,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.046482319871173156\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3468208092485549,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.3468208092485549,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077636,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2851063829787234,\n \"acc_stderr\": 0.02951319662553935,\n\
\ \"acc_norm\": 0.2851063829787234,\n \"acc_norm_stderr\": 0.02951319662553935\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.04227054451232199,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.04227054451232199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.38620689655172413,\n \"acc_stderr\": 0.04057324734419035,\n\
\ \"acc_norm\": 0.38620689655172413,\n \"acc_norm_stderr\": 0.04057324734419035\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24603174603174602,\n \"acc_stderr\": 0.022182037202948368,\n \"\
acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.022182037202948368\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n\
\ \"acc_stderr\": 0.037649508797906066,\n \"acc_norm\": 0.23015873015873015,\n\
\ \"acc_norm_stderr\": 0.037649508797906066\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.02786932057166463,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.02786932057166463\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.33497536945812806,\n \"acc_stderr\": 0.033208527423483104,\n\
\ \"acc_norm\": 0.33497536945812806,\n \"acc_norm_stderr\": 0.033208527423483104\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\"\
: 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5454545454545454,\n \"acc_stderr\": 0.038881769216741004,\n\
\ \"acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.038881769216741004\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4292929292929293,\n \"acc_stderr\": 0.03526552724601198,\n \"\
acc_norm\": 0.4292929292929293,\n \"acc_norm_stderr\": 0.03526552724601198\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5284974093264249,\n \"acc_stderr\": 0.03602573571288441,\n\
\ \"acc_norm\": 0.5284974093264249,\n \"acc_norm_stderr\": 0.03602573571288441\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.34102564102564104,\n \"acc_stderr\": 0.024035489676335065,\n\
\ \"acc_norm\": 0.34102564102564104,\n \"acc_norm_stderr\": 0.024035489676335065\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275794,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275794\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3067226890756303,\n \"acc_stderr\": 0.02995382389188704,\n \
\ \"acc_norm\": 0.3067226890756303,\n \"acc_norm_stderr\": 0.02995382389188704\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.036030385453603826,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.036030385453603826\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5229357798165137,\n \"acc_stderr\": 0.0214147570581755,\n \"acc_norm\"\
: 0.5229357798165137,\n \"acc_norm_stderr\": 0.0214147570581755\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2916666666666667,\n\
\ \"acc_stderr\": 0.03099866630456052,\n \"acc_norm\": 0.2916666666666667,\n\
\ \"acc_norm_stderr\": 0.03099866630456052\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.03492406104163613,\n\
\ \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.03492406104163613\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6033755274261603,\n \"acc_stderr\": 0.03184399873811226,\n \
\ \"acc_norm\": 0.6033755274261603,\n \"acc_norm_stderr\": 0.03184399873811226\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4304932735426009,\n\
\ \"acc_stderr\": 0.033231973029429394,\n \"acc_norm\": 0.4304932735426009,\n\
\ \"acc_norm_stderr\": 0.033231973029429394\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.45038167938931295,\n \"acc_stderr\": 0.04363643698524779,\n\
\ \"acc_norm\": 0.45038167938931295,\n \"acc_norm_stderr\": 0.04363643698524779\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.4049586776859504,\n \"acc_stderr\": 0.04481137755942469,\n \"\
acc_norm\": 0.4049586776859504,\n \"acc_norm_stderr\": 0.04481137755942469\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4074074074074074,\n\
\ \"acc_stderr\": 0.04750077341199986,\n \"acc_norm\": 0.4074074074074074,\n\
\ \"acc_norm_stderr\": 0.04750077341199986\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3803680981595092,\n \"acc_stderr\": 0.03814269893261837,\n\
\ \"acc_norm\": 0.3803680981595092,\n \"acc_norm_stderr\": 0.03814269893261837\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04109974682633932,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04109974682633932\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.36893203883495146,\n \"acc_stderr\": 0.047776151811567386,\n\
\ \"acc_norm\": 0.36893203883495146,\n \"acc_norm_stderr\": 0.047776151811567386\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.43162393162393164,\n\
\ \"acc_stderr\": 0.0324483553531149,\n \"acc_norm\": 0.43162393162393164,\n\
\ \"acc_norm_stderr\": 0.0324483553531149\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.40485312899106,\n\
\ \"acc_stderr\": 0.017553246467720256,\n \"acc_norm\": 0.40485312899106,\n\
\ \"acc_norm_stderr\": 0.017553246467720256\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3959537572254335,\n \"acc_stderr\": 0.026329813341946243,\n\
\ \"acc_norm\": 0.3959537572254335,\n \"acc_norm_stderr\": 0.026329813341946243\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n\
\ \"acc_stderr\": 0.014310999547961464,\n \"acc_norm\": 0.24134078212290502,\n\
\ \"acc_norm_stderr\": 0.014310999547961464\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3954248366013072,\n \"acc_stderr\": 0.027996723180631438,\n\
\ \"acc_norm\": 0.3954248366013072,\n \"acc_norm_stderr\": 0.027996723180631438\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.36012861736334406,\n\
\ \"acc_stderr\": 0.027264297599804015,\n \"acc_norm\": 0.36012861736334406,\n\
\ \"acc_norm_stderr\": 0.027264297599804015\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.027513747284379424,\n\
\ \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.027513747284379424\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.31560283687943264,\n \"acc_stderr\": 0.02772498944950931,\n \
\ \"acc_norm\": 0.31560283687943264,\n \"acc_norm_stderr\": 0.02772498944950931\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.29921773142112124,\n\
\ \"acc_stderr\": 0.01169537463069603,\n \"acc_norm\": 0.29921773142112124,\n\
\ \"acc_norm_stderr\": 0.01169537463069603\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3897058823529412,\n \"acc_stderr\": 0.029624663581159696,\n\
\ \"acc_norm\": 0.3897058823529412,\n \"acc_norm_stderr\": 0.029624663581159696\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3349673202614379,\n \"acc_stderr\": 0.019094228167000325,\n \
\ \"acc_norm\": 0.3349673202614379,\n \"acc_norm_stderr\": 0.019094228167000325\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.37272727272727274,\n\
\ \"acc_stderr\": 0.04631381319425463,\n \"acc_norm\": 0.37272727272727274,\n\
\ \"acc_norm_stderr\": 0.04631381319425463\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4204081632653061,\n \"acc_stderr\": 0.03160106993449604,\n\
\ \"acc_norm\": 0.4204081632653061,\n \"acc_norm_stderr\": 0.03160106993449604\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.472636815920398,\n\
\ \"acc_stderr\": 0.035302355173346824,\n \"acc_norm\": 0.472636815920398,\n\
\ \"acc_norm_stderr\": 0.035302355173346824\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.35542168674698793,\n\
\ \"acc_stderr\": 0.03726214354322415,\n \"acc_norm\": 0.35542168674698793,\n\
\ \"acc_norm_stderr\": 0.03726214354322415\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.32748538011695905,\n \"acc_stderr\": 0.035993357714560276,\n\
\ \"acc_norm\": 0.32748538011695905,\n \"acc_norm_stderr\": 0.035993357714560276\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28886168910648713,\n\
\ \"mc1_stderr\": 0.01586634640138431,\n \"mc2\": 0.45373679597767685,\n\
\ \"mc2_stderr\": 0.015753224924844992\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5422257300710339,\n \"acc_stderr\": 0.014002284504422435\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.21046560402684564,\n \
\ \"em_stderr\": 0.004174608410380015,\n \"f1\": 0.267364723154363,\n \
\ \"f1_stderr\": 0.004242093940617827\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.015163002274450341,\n \"acc_stderr\": 0.0033660229497263225\n\
\ }\n}\n```"
repo_url: https://huggingface.co/NurtureAI/Orca-2-7B-16k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|arc:challenge|25_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|drop|3_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|gsm8k|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hellaswag|10_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-25T21-39-02.599324.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-25T21-39-02.599324.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- '**/details_harness|winogrande|5_2023-11-25T21-39-02.599324.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-25T21-39-02.599324.parquet'
- config_name: results
data_files:
- split: 2023_11_25T21_39_02.599324
path:
- results_2023-11-25T21-39-02.599324.parquet
- split: latest
path:
- results_2023-11-25T21-39-02.599324.parquet
---
# Dataset Card for Evaluation run of NurtureAI/Orca-2-7B-16k
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NurtureAI/Orca-2-7B-16k
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [NurtureAI/Orca-2-7B-16k](https://huggingface.co/NurtureAI/Orca-2-7B-16k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NurtureAI__Orca-2-7B-16k_public",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-11-25T21:39:02.599324](https://huggingface.co/datasets/open-llm-leaderboard/details_NurtureAI__Orca-2-7B-16k_public/blob/main/results_2023-11-25T21-39-02.599324.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.36746546712957223,
"acc_stderr": 0.033751277531008754,
"acc_norm": 0.3738175555586316,
"acc_norm_stderr": 0.03459812342976094,
"mc1": 0.28886168910648713,
"mc1_stderr": 0.01586634640138431,
"mc2": 0.45373679597767685,
"mc2_stderr": 0.015753224924844992,
"em": 0.21046560402684564,
"em_stderr": 0.004174608410380015,
"f1": 0.267364723154363,
"f1_stderr": 0.004242093940617827
},
"harness|arc:challenge|25": {
"acc": 0.4735494880546075,
"acc_stderr": 0.014590931358120174,
"acc_norm": 0.5059726962457338,
"acc_norm_stderr": 0.014610348300255795
},
"harness|hellaswag|10": {
"acc": 0.47410874327823144,
"acc_stderr": 0.004983087049281742,
"acc_norm": 0.6389165504879506,
"acc_norm_stderr": 0.00479333052565621
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4342105263157895,
"acc_stderr": 0.0403356566784832,
"acc_norm": 0.4342105263157895,
"acc_norm_stderr": 0.0403356566784832
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.39622641509433965,
"acc_stderr": 0.030102793781791194,
"acc_norm": 0.39622641509433965,
"acc_norm_stderr": 0.030102793781791194
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3819444444444444,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.3819444444444444,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3468208092485549,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.3468208092485549,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077636,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2851063829787234,
"acc_stderr": 0.02951319662553935,
"acc_norm": 0.2851063829787234,
"acc_norm_stderr": 0.02951319662553935
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.04227054451232199,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.04227054451232199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.38620689655172413,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.38620689655172413,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.022182037202948368,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.022182037202948368
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.037649508797906066,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.037649508797906066
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4,
"acc_stderr": 0.02786932057166463,
"acc_norm": 0.4,
"acc_norm_stderr": 0.02786932057166463
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33497536945812806,
"acc_stderr": 0.033208527423483104,
"acc_norm": 0.33497536945812806,
"acc_norm_stderr": 0.033208527423483104
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.038881769216741004,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.038881769216741004
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4292929292929293,
"acc_stderr": 0.03526552724601198,
"acc_norm": 0.4292929292929293,
"acc_norm_stderr": 0.03526552724601198
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5284974093264249,
"acc_stderr": 0.03602573571288441,
"acc_norm": 0.5284974093264249,
"acc_norm_stderr": 0.03602573571288441
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.34102564102564104,
"acc_stderr": 0.024035489676335065,
"acc_norm": 0.34102564102564104,
"acc_norm_stderr": 0.024035489676335065
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275794,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275794
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3067226890756303,
"acc_stderr": 0.02995382389188704,
"acc_norm": 0.3067226890756303,
"acc_norm_stderr": 0.02995382389188704
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.036030385453603826,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.036030385453603826
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5229357798165137,
"acc_stderr": 0.0214147570581755,
"acc_norm": 0.5229357798165137,
"acc_norm_stderr": 0.0214147570581755
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.03099866630456052,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.03099866630456052
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.03492406104163613,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.03492406104163613
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6033755274261603,
"acc_stderr": 0.03184399873811226,
"acc_norm": 0.6033755274261603,
"acc_norm_stderr": 0.03184399873811226
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4304932735426009,
"acc_stderr": 0.033231973029429394,
"acc_norm": 0.4304932735426009,
"acc_norm_stderr": 0.033231973029429394
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.45038167938931295,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.45038167938931295,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4049586776859504,
"acc_stderr": 0.04481137755942469,
"acc_norm": 0.4049586776859504,
"acc_norm_stderr": 0.04481137755942469
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.04750077341199986,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.04750077341199986
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3803680981595092,
"acc_stderr": 0.03814269893261837,
"acc_norm": 0.3803680981595092,
"acc_norm_stderr": 0.03814269893261837
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25,
"acc_stderr": 0.04109974682633932,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04109974682633932
},
"harness|hendrycksTest-management|5": {
"acc": 0.36893203883495146,
"acc_stderr": 0.047776151811567386,
"acc_norm": 0.36893203883495146,
"acc_norm_stderr": 0.047776151811567386
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.43162393162393164,
"acc_stderr": 0.0324483553531149,
"acc_norm": 0.43162393162393164,
"acc_norm_stderr": 0.0324483553531149
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.40485312899106,
"acc_stderr": 0.017553246467720256,
"acc_norm": 0.40485312899106,
"acc_norm_stderr": 0.017553246467720256
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3959537572254335,
"acc_stderr": 0.026329813341946243,
"acc_norm": 0.3959537572254335,
"acc_norm_stderr": 0.026329813341946243
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.014310999547961464,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.014310999547961464
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3954248366013072,
"acc_stderr": 0.027996723180631438,
"acc_norm": 0.3954248366013072,
"acc_norm_stderr": 0.027996723180631438
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.36012861736334406,
"acc_stderr": 0.027264297599804015,
"acc_norm": 0.36012861736334406,
"acc_norm_stderr": 0.027264297599804015
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.027513747284379424,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.027513747284379424
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.31560283687943264,
"acc_stderr": 0.02772498944950931,
"acc_norm": 0.31560283687943264,
"acc_norm_stderr": 0.02772498944950931
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.29921773142112124,
"acc_stderr": 0.01169537463069603,
"acc_norm": 0.29921773142112124,
"acc_norm_stderr": 0.01169537463069603
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3897058823529412,
"acc_stderr": 0.029624663581159696,
"acc_norm": 0.3897058823529412,
"acc_norm_stderr": 0.029624663581159696
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3349673202614379,
"acc_stderr": 0.019094228167000325,
"acc_norm": 0.3349673202614379,
"acc_norm_stderr": 0.019094228167000325
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.37272727272727274,
"acc_stderr": 0.04631381319425463,
"acc_norm": 0.37272727272727274,
"acc_norm_stderr": 0.04631381319425463
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4204081632653061,
"acc_stderr": 0.03160106993449604,
"acc_norm": 0.4204081632653061,
"acc_norm_stderr": 0.03160106993449604
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.472636815920398,
"acc_stderr": 0.035302355173346824,
"acc_norm": 0.472636815920398,
"acc_norm_stderr": 0.035302355173346824
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-virology|5": {
"acc": 0.35542168674698793,
"acc_stderr": 0.03726214354322415,
"acc_norm": 0.35542168674698793,
"acc_norm_stderr": 0.03726214354322415
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.32748538011695905,
"acc_stderr": 0.035993357714560276,
"acc_norm": 0.32748538011695905,
"acc_norm_stderr": 0.035993357714560276
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28886168910648713,
"mc1_stderr": 0.01586634640138431,
"mc2": 0.45373679597767685,
"mc2_stderr": 0.015753224924844992
},
"harness|winogrande|5": {
"acc": 0.5422257300710339,
"acc_stderr": 0.014002284504422435
},
"harness|drop|3": {
"em": 0.21046560402684564,
"em_stderr": 0.004174608410380015,
"f1": 0.267364723154363,
"f1_stderr": 0.004242093940617827
},
"harness|gsm8k|5": {
"acc": 0.015163002274450341,
"acc_stderr": 0.0033660229497263225
}
}
```
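The nested result dictionary above can be post-processed directly once loaded; a minimal, illustrative sketch (assuming a dict shaped like the JSON shown, not part of the leaderboard tooling) that averages the `acc` scores of all `hendrycksTest` sub-tasks:

```python
# Illustrative only: aggregate MMLU (hendrycksTest) accuracies from a
# results dict shaped like the JSON above.
def mmlu_average(results: dict) -> float:
    accs = [
        scores["acc"]
        for task, scores in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    return sum(accs) / len(accs)

example = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.4},
    "harness|hendrycksTest-virology|5": {"acc": 0.3},
    "harness|winogrande|5": {"acc": 0.54},  # ignored: not an MMLU sub-task
}
print(mmlu_average(example))  # averages only the two hendrycksTest entries
```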
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
andstor/smart_contracts | ---
annotations_creators: []
language_creators: []
language:
- en
multilinguality:
- monolingual
pretty_name: Smart Contracts
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- text-generation
task_ids:
- language-modeling
paperswithcode_id: verified-smart-contracts
---
# Dataset Card for Smart Contracts
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [flattened](#flattened)
- [flattened_plain_text](#flattened_plain_text)
- [inflated](#inflated)
- [inflated_plain_text](#inflated_plain_text)
- [parsed](#parsed)
- [Data Fields](#data-fields)
- [flattened](#flattened-1)
- [flattened_plain_text](#flattened_plain_text-1)
- [inflated](#inflated-1)
- [inflated_plain_text](#inflated_plain_text-1)
- [parsed](#parsed-1)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://andstor.github.io/smart-contracts
- **Repository:** https://github.com/andstor/verified-smart-contracts
- **Paper:**
- **Leaderboard:**
- **Point of Contact:** [André Storhaug](mailto:andr3.storhaug@gmail.com)
### Dataset Summary
This is a dataset of verified smart contracts from Etherscan.io that are deployed to the Ethereum blockchain. A set of about 100,000 to 200,000 contracts is provided, containing both Solidity and Vyper code.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
#### flattened
```
{
'contract_name': 'MiaKhalifaDAO',
'contract_address': '0xb3862ca215d5ed2de22734ed001d701adf0a30b4',
'language': 'Solidity',
'source_code': '// File: @openzeppelin/contracts/utils/Strings.sol\r\n\r\n\r\n// OpenZeppelin Contracts v4.4.1 (utils/Strings.sol)\r\n\r\npragma solidity ^0.8.0;\r\n\r\n/**\r\n * @dev String operations.\r\n */\r\nlibrary Strings {\r\n...',
'abi': '[{"inputs":[{"internalType":"uint256","name":"maxBatchSize_","type":"uint256"}...]',
'compiler_version': 'v0.8.7+commit.e28d00a7',
'optimization_used': False,
'runs': 200,
'constructor_arguments': '000000000000000000000000000000000000000000000000000000000000000a000...',
'evm_version': 'Default',
'library': '',
'license_type': 'MIT',
'proxy': False,
'implementation': '',
'swarm_source': 'ipfs://e490df69bd9ca50e1831a1ac82177e826fee459b0b085a00bd7a727c80d74089'
}
```
#### flattened_extended
Same fields as `flattened` but with the following additional fields:
```
{
...
'tx_count': 1074,
'balance': 38
}
```
#### flattened_plain_text
```
{
'language': 'Solidity',
'text': '// File: SafeMath.sol\r\npragma solidity =0.5.16;\r\n\r\n// a library for performing overflow-safe math...'
}
```
#### inflated
```
{
'contract_name': 'PinkLemonade',
'file_path': 'PinkLemonade.sol',
'contract_address': '0x9a5be3cc368f01a0566a613aad7183783cff7eec',
'language': 'Solidity',
'source_code': '/**\r\n\r\nt.me/pinklemonadecoin\r\n*/\r\n\r\n// SPDX-License-Identifier: MIT\r\npragma solidity ^0.8.0;\r\n\r\n\r\n/*\r\n * @dev Provides information about the current execution context, including the\r\n * sender of the transaction and its data. While these are generally available...',
'abi': '[{"inputs":[],"stateMutability":"nonpayable","type":"constructor"}...]',
'compiler_version': 'v0.8.4+commit.c7e474f2',
'optimization_used': False,
'runs': 200,
'constructor_arguments': '',
'evm_version': 'Default',
'library': '',
'license_type': 'MIT',
'proxy': False,
'implementation': '',
'swarm_source': 'ipfs://eb0ac9491a04e7a196280fd27ce355a85d79b34c7b0a83ab606d27972a06050c'
}
```
#### inflated_plain_text
```
{
'language': 'Solidity',
'text': '\\npragma solidity ^0.4.11;\\n\\ncontract ERC721 {\\n // Required methods\\n function totalSupply() public view returns (uint256 total);...'
}
```
#### parsed
```
{
'contract_name': 'BondedECDSAKeep',
'file_path': '@keep-network/keep-core/contracts/StakeDelegatable.sol',
'contract_address': '0x61935dc4ffc5c5f1d141ac060c0eef04a792d8ee',
'language': 'Solidity',
'class_name': 'StakeDelegatable',
'class_code': 'contract StakeDelegatable {\n using OperatorParams for uint256;\n\n mapping(address => Operator) internal operators;\n\n struct Operator {\n uint256 packedParams;\n address owner;\n address payable beneficiary;\n address authorizer;\n }\n\n...',
'class_documentation': '/// @title Stake Delegatable\n/// @notice A base contract to allow stake delegation for staking contracts.',
'class_documentation_type': 'NatSpecSingleLine',
'func_name': 'balanceOf',
'func_code': 'function balanceOf(address _address) public view returns (uint256 balance) {\n return operators[_address].packedParams.getAmount();\n }',
'func_documentation': '/// @notice Gets the stake balance of the specified address.\n/// @param _address The address to query the balance of.\n/// @return An uint256 representing the amount staked by the passed address.',
'func_documentation_type': 'NatSpecSingleLine',
'compiler_version': 'v0.5.17+commit.d19bba13',
'license_type': 'MIT',
'swarm_source': 'bzzr://63a152bdeccda501f3e5b77f97918c5500bb7ae07637beba7fae76dbe818bda4'
}
```
### Data Fields
#### flattened
- `contract_name` (`string`): containing the smart contract name.
- `contract_address` (`string`): containing the Ethereum address for the smart contract.
- `language` (`string`): containing the language of the smart contract.
- `source_code` (`string`): containing the source code of the smart contract. This contains all code needed for compilation of the contract, including libraries.
- `abi` (`string`): containing the Application Binary Interface (ABI) of the smart contract.
- `compiler_version` (`string`): containing the compiler version used to compile the smart contract.
- `optimization_used` (`boolean`): indicating if the smart contract used optimization.
- `runs` (`number`): containing the number of optimization steps used.
- `constructor_arguments` (`string`): containing the constructor arguments of the smart contract.
- `evm_version` (`string`): containing the EVM version used to compile the smart contract.
- `library` (`string`): containing the `name:address` of libraries used separated by `;`.
- `license_type` (`string`): containing the license type of the smart contract.
- `proxy` (`boolean`): indicating if the smart contract is a proxy.
- `implementation` (`string`): containing the implementation of the smart contract if it is a proxy.
- `swarm_source` (`string`): containing the swarm source of the smart contract.
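The `library` field packs multiple entries into one string; a small, illustrative helper (not part of the dataset tooling) that splits the documented `name:address` pairs separated by `;`:

```python
# Illustrative helper: parse the `library` field, documented above as
# `name:address` pairs separated by `;` (empty when no libraries are used).
def parse_libraries(library_field: str) -> dict:
    libs = {}
    for entry in library_field.split(";"):
        if not entry:
            continue  # skip the empty string produced by an empty field
        name, _, address = entry.partition(":")
        libs[name] = address
    return libs

print(parse_libraries("SafeMath:0x1234;Strings:0xabcd"))
print(parse_libraries(""))  # no libraries used
```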
#### flattened_extended
Same fields as `flattened` but with the following additional fields:
- `tx_count` (`number`): containing the number of transactions made to the smart contract.
- `balance` (`string`): containing the ether balance of the smart contract.
#### flattened_plain_text
- `text` (`string`): containing the source code of the smart contract. This contains all code needed for compilation of the contract, including libraries.
- `language` (`string`): containing the language of the smart contract.
#### inflated
Same fields as `flattened` but with the following additional fields:
- `file_path` (`string`): containing the original path to the file.
#### inflated_plain_text
- `text` (`string`): containing the source code of the smart contract. This contains all code needed for compilation of the contract, including libraries.
- `language` (`string`): containing the language of the smart contract.
#### parsed
- `contract_name` (`string`): containing the smart contract name.
- `file_path` (`string`): containing the original path to the file.
- `contract_address` (`string`): containing the Ethereum address for the smart contract.
- `language` (`string`): containing the language of the smart contract.
- `class_name` (`string`): containing the name of the "class" (contract).
- `class_code` (`string`): containing the source code of the "class" (contract).
- `class_documentation` (`string`): containing the documentation (code comment) of the "class" (contract).
- `class_documentation_type` (`string`): containing the documentation type of the "class" (contract). Can be one of: `NatSpecMultiLine`, `NatSpecSingleLine`, `LineComment` or `Comment`.
- `func_name` (`string`): containing the name of the function definition.
- `func_code` (`string`): containing the source code of the function.
- `func_documentation` (`string`): containing the documentation (code comment) of the function.
- `func_documentation_type` (`string`): containing the documentation type of the function. Can be one of: `NatSpecMultiLine`, `NatSpecSingleLine`, `LineComment` or `Comment`.
- `compiler_version` (`string`): containing the compiler version used to compile the smart contract.
- `license_type` (`string`): containing the license type of the smart contract.
- `swarm_source` (`string`): containing the swarm source of the smart contract.
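Because records in the `parsed` configuration pair each function with its documentation, building (comment, code) training pairs is straightforward; a minimal, illustrative sketch over records shaped like the fields above (plain Python, independent of the `datasets` API):

```python
# Illustrative only: turn `parsed` records (dicts with the fields listed
# above) into (documentation, code) pairs, skipping undocumented functions.
def doc_code_pairs(records):
    pairs = []
    for rec in records:
        if rec.get("func_documentation"):
            pairs.append((rec["func_documentation"], rec["func_code"]))
    return pairs

records = [
    {"func_name": "balanceOf",
     "func_documentation": "/// @notice Gets the stake balance.",
     "func_code": "function balanceOf(address a) public view returns (uint256) { ... }"},
    {"func_name": "helper", "func_documentation": "",
     "func_code": "function helper() internal { }"},  # dropped: no documentation
]
print(doc_code_pairs(records))
```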
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```bibtex
@misc{storhaug2023efficient,
title={Efficient Avoidance of Vulnerabilities in Auto-completed Smart Contract Code Using Vulnerability-constrained Decoding},
author={André Storhaug and Jingyue Li and Tianyuan Hu},
year={2023},
eprint={2309.09826},
archivePrefix={arXiv},
primaryClass={cs.CR}
}
```
### Contributions
Thanks to [@andstor](https://github.com/andstor) for adding this dataset. |
leey4n/KR3 | ---
annotations_creators: []
language_creators: []
language:
- ko
license:
- cc-by-nc-sa-4.0
multilinguality:
- monolingual
pretty_name: KR3
size_categories:
- 100K<n<1M
source_datasets: []
task_categories:
- text-classification
task_ids:
- sentiment-classification
---
### KR3: Korean Restaurant Reviews with Ratings
Korean sentiment classification dataset
- Size: 460K(+180K)
- Language: Korean-centric
### ⚠️ Caution with `Rating` Column
0 stands for negative review, 1 stands for positive review, and 2 stands for ambiguous review.
**Note that rating 2 is not intended to be used directly for supervised learning (classification).** This data is included for additional pre-training or other purposes.
In other words, this dataset is essentially a **binary** sentiment classification task where the labels are 0 and 1.
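The label scheme above can be made explicit in code; a small, illustrative sketch (plain Python, independent of the `datasets` API) that names the ratings and drops the ambiguous class before binary training:

```python
# Illustrative only: KR3 rating semantics (0 = negative, 1 = positive,
# 2 = ambiguous; the latter is excluded from binary classification).
LABEL_NAMES = {0: "negative", 1: "positive", 2: "ambiguous"}

def binary_subset(examples):
    return [ex for ex in examples if ex["Rating"] != 2]

reviews = [
    {"Review": "최고!", "Rating": 1},      # "The best!"
    {"Review": "별로예요", "Rating": 0},   # "Not great"
    {"Review": "글쎄요", "Rating": 2},     # ambiguous: kept only for pre-training
]
print([LABEL_NAMES[ex["Rating"]] for ex in binary_subset(reviews)])
```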
### 🔍 See More
See all the codes for crawling/preprocessing the dataset and experiments with KR3 in [GitHub Repo](https://github.com/Wittgensteinian/kr3).
See Kaggle dataset in [Kaggle Dataset](https://www.kaggle.com/ninetyninenewton/kr3-korean-restaurant-reviews-with-ratings).
### Usage
```python
from datasets import load_dataset
kr3 = load_dataset("leey4n/KR3", name='kr3', split='train')
kr3 = kr3.remove_columns(['__index_level_0__']) # Original file didn't include this column; suspected Hugging Face issue.
```
```python
# drop reviews with ambiguous label
kr3_binary = kr3.filter(lambda example: example['Rating'] != 2)
```
### License
**CC BY-NC-SA 4.0**
### Legal Issues
We concluded that the **non-commercial usage and release of KR3 fall within the scope of fair use (공정 이용)** as stated in the Korean Copyright Act (저작권법). We further clarify that we **did not agree to the terms of service** of any websites that might prohibit web crawling; in other words, the crawling was performed without logging in to the websites. That said, feel free to contact any of the contributors if you notice any legal issues.
### Contributors & Acknowledgement
(Alphabetical order)
[Dongin Jung](https://github.com/dongin1009)
[Hyunwoo Kwak](https://github.com/Kwak-Hyun-woo)
[Kaeun Lee](https://github.com/Kaeun-Lee)
[Yejoon Lee](https://github.com/wittgensteinian)
This work was done as part of DIYA's 4th cohort (DIYA 4기). Compute resources needed for the work were provided by [DIYA](https://blog.diyaml.com) and surromind.ai.
|
pattern90/test4 | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 380878.0
num_examples: 6
download_size: 80720
dataset_size: 380878.0
---
# Dataset Card for "oct"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NobodyExistsOnTheInternet/KTO-PRM | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
dtype: string
- name: label
dtype: bool
splits:
- name: train
num_bytes: 379985094.2914787
num_examples: 473458
download_size: 70204316
dataset_size: 379985094.2914787
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ai4bharat/IndicCOPA | ---
annotations_creators:
- expert-generated
language:
- as
- bn
- en
- gom
- gu
- hi
- kn
- mai
- ml
- mr
- ne
- or
- pa
- sa
- sat
- sd
- ta
- te
- ur
language_creators:
- expert-generated
license:
- cc-by-4.0
multilinguality:
- multilingual
pretty_name: IndicXCOPA
size_categories:
- 1K<n<10K
source_datasets:
- extended|xcopa
tags: []
task_categories:
- multiple-choice
task_ids:
- multiple-choice-qa
---
# Dataset Card for [Dataset Name]
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset. |
deven367/babylm-100M-children-stories | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 17676869
num_examples: 76758
- name: valid
num_bytes: 1425137
num_examples: 5996
- name: test
num_bytes: 1804421
num_examples: 7959
download_size: 12749002
dataset_size: 20906427
---
# Dataset Card for "babylm-100M-children-stories"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pkufool/librilight-text | ---
license: apache-2.0
---
|
TheDKBR/thedkbr | ---
license: openrail
---
|
ramixpe/sp_llama_simple | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 10031378
num_examples: 20551
download_size: 2324224
dataset_size: 10031378
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
saurabhRaj11/MobiusBpmndataset | ---
license: mit
---
|
KaleidoSG/Deepmind | ---
license: other
language:
- en
pretty_name: Deepmind
size_categories:
- 1M<n<10M
configs:
- config_name: default
data_files:
- split: train
path:
- train/OpenOrca/*.csv
- train/dolphin/*.csv
- train/flan_zsnoopt_data/*.csv
- train/t0_zsnoopt_data/*.csv
---
# Deepmind Dataset
## Overview
The Deepmind dataset is a curated collection of high-quality datasets selected for their relevance, diversity, and overall data quality, covering a wide range of research and application needs. The collection is provided in the Stanford Alpaca format, ensuring consistency and ease of use across projects and applications.
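Since the collection is distributed in the Stanford Alpaca format, each record is expected to carry `instruction`, `input`, and `output` fields (the standard Alpaca field names, assumed here rather than verified against every file). A minimal prompt-formatting sketch:

```python
def to_prompt(example):
    """Render an Alpaca-style record into a single training prompt."""
    if example.get("input"):
        return (
            "Below is an instruction that describes a task, paired with an "
            "input that provides further context.\n\n"
            f"### Instruction:\n{example['instruction']}\n\n"
            f"### Input:\n{example['input']}\n\n"
            f"### Response:\n{example['output']}"
        )
    return (
        "Below is an instruction that describes a task.\n\n"
        f"### Instruction:\n{example['instruction']}\n\n"
        f"### Response:\n{example['output']}"
    )

example = {"instruction": "Add the numbers.", "input": "2 and 3", "output": "5"}
print("### Input:" in to_prompt(example))  # True
```

Records without an `input` simply drop the `### Input:` section, following the usual Alpaca convention.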
# License
The Deepmind dataset is made available under the Apache License 2.0, which allows for flexible usage, modification, and distribution while maintaining attribution to the original data sources.
# Dataset Details
Here's a snapshot of some of the top datasets included in the Deepmind collection:
| Dataset | Files | Size |
|---------------|-------|---------|
| Open_Orca | 2 | 6.9 GB |
| Dolphin | 2 | 5.7 GB |
| FLAN | 78 | 15.1 GB |
| t0 | 124 | 25.9 GB |
| **Total** | 206 | 53.6 GB |
The datasets in the Deepmind collection span a diverse array of domains and are carefully selected for their high quality. These datasets cater to various research and application needs, each offering unique insights and applications.
# Citation
```
Deepmind Dataset.
2023
Deepmind
Retrieved from huggingface.co/datasets/NewstaR/Deepmind
Apache 2.0 License
``` |
Chaoticka/test | ---
license: artistic-2.0
tags:
- art
pretty_name: Chaos Doll
--- |
huggingartists/ghostmane | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/ghostmane"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.027776 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://assets.genius.com/images/default_avatar_300.png?1631290285')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/ghostmane">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Ghostmane</div>
<a href="https://genius.com/artists/ghostmane">
<div style="text-align: center; font-size: 14px;">@ghostmane</div>
</a>
</div>
### Dataset Summary
This dataset contains lyrics parsed from Genius. It is designed for generating lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/ghostmane).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/ghostmane")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|2| -| -|
The 'train' split can easily be divided into 'train', 'validation' and 'test' splits with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/ghostmane")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03

# split at the 90% and 97% boundaries
texts = datasets['train']['text']
train, validation, test = np.split(
    texts,
    [int(len(texts) * train_percentage),
     int(len(texts) * (train_percentage + validation_percentage))],
)
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
author={Aleksey Korshuk}
year=2021
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
suwitlam/whisper-sun-system-dataset-sentence | ---
license: cc0-1.0
---
|
Tamnemtf/hcmue-qa-1 | ---
license: llama2
---
|
jfloresf/burn-cems-1 | ---
license: cc-by-4.0
---
|
FaalSa/dfaas3 | ---
dataset_info:
features:
- name: start
dtype: timestamp[s]
- name: target
sequence: float32
- name: item_id
dtype: string
- name: feat_static_cat
sequence: uint64
splits:
- name: train
num_bytes: 57633
num_examples: 1
- name: validation
num_bytes: 58113
num_examples: 1
- name: test
num_bytes: 58593
num_examples: 1
download_size: 35533
dataset_size: 174339
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
awacke1/eCQM-Code-Value-Semantic-Set.csv | ---
license: mit
---
eCQM-Code-Value-Semantic-Set.csv |
GDGiangi/SEIRDB | ---
language:
- en
- fr
- it
- el
- es
- ru
pretty_name: SEIRDB
size_categories:
- 100K<n<1M
task_categories:
- audio-classification
extra_gated_prompt: "To obtain an access token, the database licence must be purchased through https://gabegiangi.wordpress.com/2023/05/15/seir-db/"
extra_gated_fields:
Name: text
Email: text
Company: text
Country: text
Access Token: text
I agree not to give access to any other entities: checkbox
---
# Speech Emotion Intensity Recognition Database (SEIR-DB)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact: gabegiangi@gmail.com**
### Dataset Summary
The SEIR-DB is a comprehensive, multilingual speech emotion intensity recognition dataset containing over 600,000 instances from various sources. It is designed to support tasks related to speech emotion recognition and emotion intensity estimation. The database includes languages such as English, Russian, Mandarin, Greek, Italian, and French.
### Supported Tasks and Leaderboards
The SEIR dataset is suitable for speech emotion recognition and, for a subset of the data, speech emotion intensity estimation.
### Languages
SEIR-DB encompasses multilingual data, featuring languages such as English, Russian, Mandarin, Greek, Italian, and French.
## Dataset Structure
### Data Instances
The raw data collection comprises over 600,000 data instances (375 hours). Users of the database can access the raw audio data, which is stored in subdirectories of the data directory (in their respective datasets).
After processing, cleaning, and formatting, the dataset contains approximately 120,000 training instances with an average audio utterance length of 3.8 seconds.
### Data Fields
- ID: unique sample identifier
- WAV: path to the audio file, located in the data directory
- EMOTION: annotated emotion
- INTENSITY: annotated intensity (ranging from 1-5), where 1 denotes low intensity, and 5 signifies high intensity; 0 indicates no annotation
- LENGTH: duration of the audio utterance
### Data Splits
The data is divided into train, test, and validation sets, located in the respective JSON manifest files.
- Train: 80%
- Validation: 10%
- Test: 10%
For added flexibility, unsplit data is also available in data.csv to allow custom splits.
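For example, a custom stratified split (keeping each emotion's proportions similar across the splits) could be sketched as follows. The `ID` and `EMOTION` field names follow the Data Fields list above, the toy rows stand in for records read from `data.csv`, and the ratios mirror the default 80/10/10 split:

```python
import random
from collections import defaultdict

def stratified_split(rows, key="EMOTION", ratios=(0.8, 0.1, 0.1), seed=42):
    """Split a list of dict rows into train/val/test, keeping each
    emotion's share roughly equal across the three splits."""
    rng = random.Random(seed)
    buckets = defaultdict(list)
    for row in rows:
        buckets[row[key]].append(row)
    train, val, test = [], [], []
    for group in buckets.values():
        rng.shuffle(group)
        n_train = int(len(group) * ratios[0])
        n_val = int(len(group) * ratios[1])
        train += group[:n_train]
        val += group[n_train:n_train + n_val]
        test += group[n_train + n_val:]
    return train, val, test

# toy rows standing in for data.csv records
rows = [{"ID": i, "EMOTION": e} for i in range(100) for e in ("happy", "sad")]
train, val, test = stratified_split(rows)
print(len(train), len(val), len(test))  # → 160 20 20
```

In practice the rows would come from `data.csv` (e.g. via `csv.DictReader`), but the splitting logic is the same.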
## Dataset Creation
### Curation Rationale
The SEIR-DB was curated to maximize the volume of data instances, addressing a significant limitation in speech emotion recognition (SER) experimentation—the lack of emotion data and the small size of available datasets. This database aims to resolve these issues by providing a large volume of emotion-annotated data that is cleanly formatted for experimentation.
### Source Data
The dataset was compiled from various sources.
### Annotations
#### Annotation process
For details on the annotation process, please refer to the source for each dataset, as they were conducted differently. However, the entire database is human-annotated.
#### Who are the annotators?
Please consult the source documentation for information on the annotators.
### Personal and Sensitive Information
No attempt was made to remove personal or sensitive information, since the recordings (and consent) were obtained by the original dataset creators rather than internally.
## Considerations for Using the Data
### Social Impact of Dataset
The SEIR-DB dataset can significantly impact the research and development of speech emotion recognition technologies by providing a large volume of annotated data. These technologies have the potential to enhance various applications, such as mental health monitoring, virtual assistants, customer support, and communication devices for people with disabilities.
### Discussion of Biases
During the dataset cleaning process, efforts were made to balance the database concerning the number of samples for each dataset, emotion distribution (with a greater focus on primary emotions and less on secondary emotions), and language distribution. However, biases may still be present.
### Other Known Limitations
No specific limitations have been identified at this time.
## Additional Information
### Dataset Curators
Gabriel Giangi - Concordia University - Montreal, QC Canada - gabegiangi@gmail.com
### Licensing Information
This dataset can be used for research and academic purposes. For commercial purposes, please contact gabegiangi@gmail.com.
### Citation Information
Aljuhani, R. H., Alshutayri, A., & Alahdal, S. (2021). Arabic speech emotion recognition from Saudi dialect corpus. IEEE Access, 9, 127081-127085.
Basu, S., Chakraborty, J., & Aftabuddin, M. (2017). Emotion recognition from speech using convolutional neural network with recurrent neural network architecture. In ICCES.
Baevski, A., Zhou, H. H., & Collobert, R. (2020). Wav2vec 2.0: A framework for self-supervised learning of speech representations. In NeurIPS.
Busso, C., Bulut, M., Lee, C. C., Kazemzadeh, A., Mower, E., Kim, S., ... & Narayanan, S. (2008). Iemocap: Interactive emotional dyadic motion capture database. In LREC.
Cao, H., Cooper, D.G., Keutmann, M.K., Gur, R.C., Nenkova, A., & Verma, R. (2014). CREMA-D: Crowd-Sourced Emotional Multimodal Actors Dataset. IEEE Transactions on Affective Computing, 5, 377-390.
Chopra, S., Mathur, P., Sawhney, R., & Shah, R. R. (2021). Meta-Learning for Low-Resource Speech Emotion Recognition. In ICASSP.
Costantini, G., Iaderola, I., Paoloni, A., & Todisco, M. (2014). EMOVO Corpus: an Italian Emotional Speech Database. In Proceedings of the Ninth International Conference on Language Resources and Evaluation (LREC'14) (pp. 3501-3504). European Language Resources Association (ELRA). Reykjavik, Iceland. http://www.lrec-conf.org/proceedings/lrec2014/pdf/591_Paper.pdf
Duville, Mathilde Marie; Alonso-Valerdi, Luz María; Ibarra-Zarate, David I. (2022), “Mexican Emotional Speech Database (MESD)”, Mendeley Data, V5, doi: 10.17632/cy34mh68j9.5
Gournay, Philippe, Lahaie, Olivier, & Lefebvre, Roch. (2018). A Canadian French Emotional Speech Dataset (1.1) [Data set]. ACM Multimedia Systems Conference (MMSys 2018) (MMSys'18), Amsterdam, The Netherlands. Zenodo. https://doi.org/10.5281/zenodo.1478765
Kandali, A., Routray, A., & Basu, T. (2008). Emotion recognition from Assamese speeches using MFCC features and GMM classifier. In TENCON.
Kondratenko, V., Sokolov, A., Karpov, N., Kutuzov, O., Savushkin, N., & Minkin, F. (2022). Large Raw Emotional Dataset with Aggregation Mechanism. arXiv preprint arXiv:2212.12266.
Kwon, S. (2021). MLT-DNet: Speech emotion recognition using 1D dilated CNN based on multi-learning trick approach. Expert Systems with Applications, 167, 114177.
Lee, Y., Lee, J. W., & Kim, S. (2019). Emotion recognition using convolutional neural network and multiple feature fusion. In ICASSP.
Li, Y., Baidoo, C., Cai, T., & Kusi, G. A. (2019). Speech emotion recognition using 1d cnn with no attention. In ICSEC.
Lian, Z., Tao, J., Liu, B., Huang, J., Yang, Z., & Li, R. (2020). Context-Dependent Domain Adversarial Neural Network for Multimodal Emotion Recognition. In Interspeech.
Livingstone, S. R., & Russo, F. A. (2018). The Ryerson audio-visual database of emotional speech and song (RAVDESS): A dynamic, multimodal set of facial and vocal expressions in North American English. PLoS ONE, 13(5), e0196391.
Peng, Z., Li, X., Zhu, Z., Unoki, M., Dang, J., & Akagi, M. (2020). Speech emotion recognition using 3d convolutions and attention-based sliding recurrent networks with auditory front-ends. IEEE Access, 8, 16560-16572.
Poria, S., Hazarika, D., Majumder, N., Naik, G., Cambria, E., & Mihalcea, R. (2019). Meld: A multimodal multi-party dataset for emotion recognition in conversations. In ACL.
Schneider, A., Baevski, A., & Collobert, R. (2019). Wav2vec: Unsupervised pre-training for speech recognition. In ICLR.
Schuller, B., Rigoll, G., & Lang, M. (2010). Speech emotion recognition: Features and classification models. In Interspeech.
Sinnott, R. O., Radulescu, A., & Kousidis, S. (2013). Surrey audiovisual expressed emotion (savee) database. In AVEC.
Vryzas, N., Kotsakis, R., Liatsou, A., Dimoulas, C. A., & Kalliris, G. (2018). Speech emotion recognition for performance interaction. Journal of the Audio Engineering Society, 66(6), 457-467.
Vryzas, N., Matsiola, M., Kotsakis, R., Dimoulas, C., & Kalliris, G. (2018, September). Subjective Evaluation of a Speech Emotion Recognition Interaction Framework. In Proceedings of the Audio Mostly 2018 on Sound in Immersion and Emotion (p. 34). ACM.
Wang, Y., Yang, Y., Liu, Y., Chen, Y., Han, N., & Zhou, J. (2019). Speech emotion recognition using a combination of cnn and rnn. In Interspeech.
Yoon, S., Byun, S., & Jung, K. (2018). Multimodal speech emotion recognition using audio and text. In SLT.
Zhang, R., & Liu, M. (2020). Speech emotion recognition with self-attention. In ACL.
### Contributions
Gabriel Giangi - Concordia University - Montreal, QC Canada - gabegiangi@gmail.com |
acloudfan/hindi-to-english-translate | ---
license: apache-2.0
---
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_265 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 948318804.0
num_examples: 186237
download_size: 967011151
dataset_size: 948318804.0
---
# Dataset Card for "chunk_265"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
deokhk/en_wiki_sentences_1000000 | ---
dataset_info:
features:
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 124980032
num_examples: 1000000
- name: dev
num_bytes: 123586
num_examples: 1000
download_size: 77463265
dataset_size: 125103618
---
# Dataset Card for "en_wiki_sentences_1000000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
TurkuNLP/genre-6 | ---
task_categories:
- text-classification
language:
- en
size_categories:
- 10K<n<100K
---
# Dataset Card for Dataset Name
## Dataset Summary
The Genre-6 dataset is an English dataset based on Kindletrends (UK & US). It contains more than 20k books and their associated categories, with ready-made binary and multilabel classification labels.
## Dataset Structure
### Data Instances
`` {"text": "...", "categories": "Engineering & Transportation;Science & Math", "fiction": "non-fiction", "split1": ['Science & Math'], "split2" : ['Engineering & Transportation', 'Science & Math'], "split3": ['Science & Math']} ``
### Data Fields
- text: Kindletrends text
- categories: Kindletrends categories (1 to 2 categories per book)
- fiction: binary label for fiction and non-fiction books
- splits 1,2,3: multilabel for different subsets of the categories
### Data Splits
The dataset contains train (80%), validation (10%) and test (10%) splits.
The multilabel splits are as follows:
- split1: 'Biology & Nature & Biological Sciences','Computer Science', 'Fantasy','Medicine & Health Sciences','Philosophy','Science & Math'.
- split2: 'Biology & Nature & Biological Sciences','Computer Science', 'Engineering & Transportation','Fantasy','Medicine & Health Sciences','Science & Math'.
- split3: 'Biology & Nature & Biological Sciences','Computer Science', 'Fantasy','Medicine & Health Sciences', 'Poetry', 'Politics & Social Sciences', 'Science & Math'.
More splits can be generated from the field "categories".
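As a sketch, a custom multilabel split restricted to an arbitrary category subset could be derived from the semicolon-separated `categories` field (format assumed from the example instance above):

```python
def make_split(rows, allowed):
    """Keep only rows whose categories intersect `allowed`, and
    restrict each row's label list to that subset."""
    out = []
    for row in rows:
        cats = [c.strip() for c in row["categories"].split(";")]
        labels = sorted(set(cats) & set(allowed))
        if labels:
            out.append({"text": row["text"], "labels": labels})
    return out

# toy rows in the shape of the example instance above
rows = [
    {"text": "a", "categories": "Engineering & Transportation;Science & Math"},
    {"text": "b", "categories": "Poetry"},
]
subset = make_split(rows, {"Science & Math", "Fantasy"})
print(subset)  # → [{'text': 'a', 'labels': ['Science & Math']}]
```

The same function reproduces splits 1 through 3 when given the category lists above as `allowed`.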
### Source Data
[Kindletrends](https://kindletrends.com/categories/)
|
ebylmz/architects | ---
license: mit
---
|
LLMao/2024_03_10_03_32_56_Archive | ---
dataset_info:
features:
- name: page_content
dtype: string
- name: metadata
struct:
- name: source
dtype: string
- name: page
dtype: int64
splits:
- name: train
num_bytes: 2578897
num_examples: 7
download_size: 1430198
dataset_size: 2578897
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hongboyang/CMRC2018_instruction1 | ---
dataset_info:
features:
- name: INPUT
dtype: string
- name: TARGET
dtype: string
splits:
- name: train
num_bytes: 17133521
num_examples: 10142
download_size: 4142597
dataset_size: 17133521
---
# Dataset Card for "CMRC2018_instruction1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ghbacct/financial-phrasebank-all-agree-clustering | ---
dataset_info:
features:
- name: sentences
sequence: string
- name: labels
sequence: int64
splits:
- name: test
num_bytes: 303379
num_examples: 1
download_size: 166862
dataset_size: 303379
---
# Dataset Card for "financial-phrasebank-all-agree-clustering"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Danieldlima21/mcmelody | ---
license: openrail
---
|
zolak/twitter_dataset_81_1713085248 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2664104
num_examples: 6535
download_size: 1316393
dataset_size: 2664104
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
serhii-korobchenko/github-issues-embeddings | ---
dataset_info:
features:
- name: html_url
dtype: string
- name: comments
dtype: string
- name: title
dtype: string
- name: body
dtype: string
- name: comment_length
dtype: int64
- name: text
dtype: string
- name: embeddings
sequence: float32
splits:
- name: train
num_bytes: 44924892
num_examples: 5034
download_size: 23623074
dataset_size: 44924892
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
HamdanXI/paradetox-split | ---
dataset_info:
features:
- name: en_toxic_comment
dtype: string
- name: en_neutral_comment
dtype: string
splits:
- name: train
num_bytes: 2082140.019398298
num_examples: 19073
- name: test
num_bytes: 78390
num_examples: 671
download_size: 1237763
dataset_size: 2160530.019398298
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
eduzon/joaopauloo | ---
license: openrail
---
|
Sachin7/story_dataset | ---
dataset_info:
features:
- name: text
dtype: string
- name: 'Unnamed: 1'
dtype: float64
splits:
- name: train
num_bytes: 143265.9775280899
num_examples: 62
- name: test
num_bytes: 62390.02247191011
num_examples: 27
download_size: 145722
dataset_size: 205656.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
qnguyen3/alapaca-vi | ---
license: mit
---
|
long292/PADNCH_5 | ---
dataset_info:
features:
- name: Phiên âm
dtype: string
- name: Dịch nghĩa
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1579554
num_examples: 7406
download_size: 925406
dataset_size: 1579554
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
InceptiveDev/job_title | ---
license: mit
---
|
LEAP/ClimSim_low-res | ---
license: cc-by-4.0
---
Corresponding GitHub repo can be found here:
https://github.com/leap-stc/ClimSim
Read more: https://arxiv.org/abs/2306.08754. |
soddokayo/crime2 | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence: int64
splits:
- name: train
num_bytes: 38639.2131147541
num_examples: 54
- name: test
num_bytes: 5008.786885245901
num_examples: 7
download_size: 17337
dataset_size: 43648.0
---
# Dataset Card for "crime2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/gr_sl8_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of gr_sl8/GrSL8/SL8 (Girls' Frontline)
This is the dataset of gr_sl8/GrSL8/SL8 (Girls' Frontline), containing 26 images and their tags.
The core tags of this character are `breasts, purple_eyes, grey_hair, short_hair, medium_breasts, white_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 26 | 26.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_sl8_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 26 | 17.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_sl8_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 58 | 32.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_sl8_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 26 | 24.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_sl8_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 58 | 41.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_sl8_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/gr_sl8_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be minable from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------|
| 0 | 26 |  |  |  |  |  | 1girl, solo, smile, cleavage, looking_at_viewer, jacket, navel, simple_background, teeth, white_background, holding |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | smile | cleavage | looking_at_viewer | jacket | navel | simple_background | teeth | white_background | holding |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:-----------|:--------------------|:---------|:--------|:--------------------|:--------|:-------------------|:----------|
| 0 | 26 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X |
|
mismatch-quest/SeeTRUE-Feedback | ---
configs:
- config_name: default
data_files:
- split: test
path: "test/*"
annotations_creators:
- crowdsourced
language:
- en
language_creators:
- found
license:
- cc-by-4.0
multilinguality:
- monolingual
paperswithcode_id: seetrue-feedback
pretty_name: SeeTRUE-feedback
size_categories:
- 1K<n<10K
source_datasets:
- original
tags:
- text-image-matching
task_ids: []
extra_gated_prompt: "By clicking on “Access repository” below, you also agree that you are using it solely for research purposes, and that SeeTRUE-Feedback should be used as a *TEST SET*, not as a training set, and especially not to train commercial chatbots. Do not hesitate to contact briangordon@mail.tau.ac.il or yonatanbitton@google.com if you have questions about this license."
---
# Dataset Card for SeeTRUE-Feedback
- [Dataset Description](#dataset-description)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
The SeeTRUE-Feedback dataset is a diverse benchmark for the meta-evaluation of image-text matching/alignment feedback. It aims to overcome limitations in current benchmarks, which primarily focus on predicting a matching score between 0 and 1. For each row, SeeTRUE-Feedback provides the original caption, feedback describing the text-image misalignment, and the textual and visual sources of the misalignment (including a bounding box for the visual misalignment).
### Languages
The dataset supports English language.
## Dataset Structure
### Data Fields
- image_caption - Caption associated with the image.
- image_name: The name of the image file.
- dataset_source: The source/origin dataset of the image.
- id_in_source_dataset: The ID of the row in the source dataset it originates from.
- image_url: An S3 link from which you can download the image.
- human_feedback: Human-annotated feedback about image-text misalignments.
- feedback: Summary of the human feedback consolidated into a single entry (generated by an LLM: PaLM-2).
- feedback_clean: A parsed and "clean" version of the `feedback` field.
- caption_misalignment: Source of misalignment in the image caption.
- visual_misalignment: Source of misalignment in the image.
- bbox_GroundingDino: Detected visual misalignment bounding-box in GroundingDino output format.
- bbox_PaLI: Detected visual misalignment bounding-box in PaLI output format.
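GroundingDino typically reports boxes in a normalized `(cx, cy, w, h)` format, while downstream tooling often expects absolute corner coordinates. A minimal sketch of that conversion, assuming the `bbox_GroundingDino` field follows the usual normalized center format (verify against the actual data before relying on it):

```python
def cxcywh_norm_to_xyxy(box, width, height):
    """Convert a normalized (cx, cy, w, h) box to absolute (x1, y1, x2, y2) pixels."""
    cx, cy, w, h = box
    x1 = (cx - w / 2.0) * width
    y1 = (cy - h / 2.0) * height
    x2 = (cx + w / 2.0) * width
    y2 = (cy + h / 2.0) * height
    return (x1, y1, x2, y2)
```

The PaLI-style `bbox_PaLI` field uses its own output convention, so it would need a separate decoder.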
### Data Splits
SeeTRUE-Feedback contains a single split, `test`, and should not be used for training.
## Dataset Creation
The dataset was created by sourcing and matching images and text from multiple datasets. More information is available in the paper: <TODO>
### Licensing Information
The dataset is under the CC-By 4.0 license.
### Citation Information
TODO |
HydraIndicLM/tamil_alpaca_dolly_51K | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: text
dtype: string
- name: system_prompt
dtype: string
splits:
- name: train
num_bytes: 287556653
num_examples: 51876
download_size: 84685617
dataset_size: 287556653
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
## About
This repo contains a 51K instruction set for Tamil, translated from Alpaca and Dolly.
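The rows carry Alpaca-style `instruction`/`input`/`output` fields; a minimal sketch of rendering one row into a single training string, using the common Alpaca prompt template as an assumption (not necessarily the exact template behind this repo's `text` column):

```python
def build_prompt(row):
    """Render an instruction/input/output row into one training string.

    Rows with an empty `input` omit the Input section, mirroring the
    two-template convention used by Alpaca-style datasets.
    """
    if row.get("input"):
        return (
            "### Instruction:\n" + row["instruction"] + "\n\n"
            "### Input:\n" + row["input"] + "\n\n"
            "### Response:\n" + row["output"]
        )
    return (
        "### Instruction:\n" + row["instruction"] + "\n\n"
        "### Response:\n" + row["output"]
    )
```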
## Citation
If you find this repository useful, please consider giving 👏 and citing:
```
@misc{TamilAlpacaDolly,
author = {Sambit Sekhar and Shantipriya Parida},
title = {Tamil Instruction Set Based on Alpaca and Dolly},
year = {2023},
publisher = {Hugging Face},
journal = {Hugging Face repository},
howpublished = {\url{https://huggingface.co/OdiaGenAI}},
}
```
|
open-llm-leaderboard/details_l3utterfly__mistral-7b-v0.1-layla-v4-chatml | ---
pretty_name: Evaluation run of l3utterfly/mistral-7b-v0.1-layla-v4-chatml
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [l3utterfly/mistral-7b-v0.1-layla-v4-chatml](https://huggingface.co/l3utterfly/mistral-7b-v0.1-layla-v4-chatml)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of the\
  \ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_l3utterfly__mistral-7b-v0.1-layla-v4-chatml\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-15T09:05:25.657589](https://huggingface.co/datasets/open-llm-leaderboard/details_l3utterfly__mistral-7b-v0.1-layla-v4-chatml/blob/main/results_2024-03-15T09-05-25.657589.json)(note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.637291875109227,\n\
\ \"acc_stderr\": 0.03233301893409507,\n \"acc_norm\": 0.6404117572756901,\n\
\ \"acc_norm_stderr\": 0.03298081985107007,\n \"mc1\": 0.2839657282741738,\n\
\ \"mc1_stderr\": 0.015785370858396725,\n \"mc2\": 0.4302819836285525,\n\
\ \"mc2_stderr\": 0.01426088687933726\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5827645051194539,\n \"acc_stderr\": 0.014409825518403079,\n\
\ \"acc_norm\": 0.6203071672354948,\n \"acc_norm_stderr\": 0.014182119866974872\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.630551682931687,\n\
\ \"acc_stderr\": 0.004816690123209757,\n \"acc_norm\": 0.8339972117108145,\n\
\ \"acc_norm_stderr\": 0.003713227064225392\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.046482319871173156,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.046482319871173156\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.05021167315686779,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.05021167315686779\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383887,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383887\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3941798941798942,\n \"acc_stderr\": 0.025167982333894143,\n \"\
acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.025167982333894143\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n\
\ \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n\
\ \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6358974358974359,\n \"acc_stderr\": 0.024396672985094764,\n\
\ \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.024396672985094764\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.362962962962963,\n \"acc_stderr\": 0.029318203645206865,\n \
\ \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.029318203645206865\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8238532110091743,\n \"acc_stderr\": 0.01633288239343138,\n \"\
acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.01633288239343138\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808503,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808503\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159464,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159464\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097652,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097652\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077805,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077805\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.80970625798212,\n\
\ \"acc_stderr\": 0.014036945850381396,\n \"acc_norm\": 0.80970625798212,\n\
\ \"acc_norm_stderr\": 0.014036945850381396\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.025009313790069713,\n\
\ \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.025009313790069713\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33854748603351953,\n\
\ \"acc_stderr\": 0.01582670009648135,\n \"acc_norm\": 0.33854748603351953,\n\
\ \"acc_norm_stderr\": 0.01582670009648135\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875195,\n\
\ \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875195\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n\
\ \"acc_stderr\": 0.012739711554045706,\n \"acc_norm\": 0.4654498044328553,\n\
\ \"acc_norm_stderr\": 0.012739711554045706\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806308,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806308\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.0282638899437846,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.0282638899437846\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.02519692987482707,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.02519692987482707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368036,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368036\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2839657282741738,\n\
\ \"mc1_stderr\": 0.015785370858396725,\n \"mc2\": 0.4302819836285525,\n\
\ \"mc2_stderr\": 0.01426088687933726\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7932123125493291,\n \"acc_stderr\": 0.011382566829235802\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5382865807429871,\n \
\ \"acc_stderr\": 0.013732048227016682\n }\n}\n```"
repo_url: https://huggingface.co/l3utterfly/mistral-7b-v0.1-layla-v4-chatml
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|arc:challenge|25_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|gsm8k|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hellaswag|10_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T09-05-25.657589.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-15T09-05-25.657589.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- '**/details_harness|winogrande|5_2024-03-15T09-05-25.657589.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-15T09-05-25.657589.parquet'
- config_name: results
data_files:
- split: 2024_03_15T09_05_25.657589
path:
- results_2024-03-15T09-05-25.657589.parquet
- split: latest
path:
- results_2024-03-15T09-05-25.657589.parquet
---
# Dataset Card for Evaluation run of l3utterfly/mistral-7b-v0.1-layla-v4-chatml
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [l3utterfly/mistral-7b-v0.1-layla-v4-chatml](https://huggingface.co/l3utterfly/mistral-7b-v0.1-layla-v4-chatml) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, one for each evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_l3utterfly__mistral-7b-v0.1-layla-v4-chatml",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-15T09:05:25.657589](https://huggingface.co/datasets/open-llm-leaderboard/details_l3utterfly__mistral-7b-v0.1-layla-v4-chatml/blob/main/results_2024-03-15T09-05-25.657589.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.637291875109227,
"acc_stderr": 0.03233301893409507,
"acc_norm": 0.6404117572756901,
"acc_norm_stderr": 0.03298081985107007,
"mc1": 0.2839657282741738,
"mc1_stderr": 0.015785370858396725,
"mc2": 0.4302819836285525,
"mc2_stderr": 0.01426088687933726
},
"harness|arc:challenge|25": {
"acc": 0.5827645051194539,
"acc_stderr": 0.014409825518403079,
"acc_norm": 0.6203071672354948,
"acc_norm_stderr": 0.014182119866974872
},
"harness|hellaswag|10": {
"acc": 0.630551682931687,
"acc_stderr": 0.004816690123209757,
"acc_norm": 0.8339972117108145,
"acc_norm_stderr": 0.003713227064225392
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.31,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.48,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383887,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383887
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.025167982333894143,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.025167982333894143
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.024396672985094764,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.024396672985094764
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.029318203645206865,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.029318203645206865
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.01633288239343138,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.01633288239343138
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808503,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808503
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159464,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159464
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097652,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097652
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077805,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077805
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.80970625798212,
"acc_stderr": 0.014036945850381396,
"acc_norm": 0.80970625798212,
"acc_norm_stderr": 0.014036945850381396
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.684971098265896,
"acc_stderr": 0.025009313790069713,
"acc_norm": 0.684971098265896,
"acc_norm_stderr": 0.025009313790069713
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33854748603351953,
"acc_stderr": 0.01582670009648135,
"acc_norm": 0.33854748603351953,
"acc_norm_stderr": 0.01582670009648135
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.012739711554045706,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.012739711554045706
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806308,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806308
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.0282638899437846,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.0282638899437846
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482707,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368036,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368036
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2839657282741738,
"mc1_stderr": 0.015785370858396725,
"mc2": 0.4302819836285525,
"mc2_stderr": 0.01426088687933726
},
"harness|winogrande|5": {
"acc": 0.7932123125493291,
"acc_stderr": 0.011382566829235802
},
"harness|gsm8k|5": {
"acc": 0.5382865807429871,
"acc_stderr": 0.013732048227016682
}
}
```
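For a quick sanity check, per-task scores can be aggregated straight from this JSON. The sketch below hardcodes a few of the values above; the `primary_metric` helper and its metric-preference order are illustrative assumptions, not part of the eval harness:

```python
# A few task scores copied from the results JSON above.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.6203071672354948},
    "harness|hellaswag|10": {"acc_norm": 0.8339972117108145},
    "harness|winogrande|5": {"acc": 0.7932123125493291},
    "harness|gsm8k|5": {"acc": 0.5382865807429871},
}

def primary_metric(scores):
    # Prefer normalized accuracy when present, falling back to raw
    # accuracy, then to the TruthfulQA-style mc2 score.
    for key in ("acc_norm", "acc", "mc2"):
        if key in scores:
            return scores[key]
    raise KeyError("no known metric in %r" % scores)

macro_avg = sum(primary_metric(s) for s in results.values()) / len(results)
print(round(macro_avg, 4))  # macro average over these four tasks
```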
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
gglab-ku/cogeval-human-SNLI-lalor | ---
license: apache-2.0
---
|
liuyanchen1015/MULTI_VALUE_mrpc_drop_aux_be_gonna | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 774
num_examples: 4
- name: train
num_bytes: 3388
num_examples: 14
- name: validation
num_bytes: 1008
num_examples: 4
download_size: 13603
dataset_size: 5170
---
# Dataset Card for "MULTI_VALUE_mrpc_drop_aux_be_gonna"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bloyal/deeploc | ---
license: cc-by-4.0
---
# DeepLoc-2.0 Training Data
Dataset from https://services.healthtech.dtu.dk/services/DeepLoc-2.0/ used to train the DeepLoc-2.0 model.
## Data preparation
Data downloaded and processed using the following Python script:
```python
import pandas as pd

# Download the Swiss-Prot train/validation data and drop bookkeeping columns.
df = pd.read_csv('https://services.healthtech.dtu.dk/services/DeepLoc-2.0/data/Swissprot_Train_Validation_dataset.csv').drop(['Unnamed: 0', 'Partition'], axis=1)

# Collapse the ten one-hot location columns into a single multi-hot `labels` column.
df['labels'] = df[['Cell membrane', 'Cytoplasm', 'Endoplasmic reticulum', 'Extracellular', 'Golgi apparatus', 'Lysosome/Vacuole', 'Mitochondrion', 'Nucleus', 'Peroxisome', 'Plastid']].astype('float32').values.tolist()
df['Membrane'] = df['Membrane'].astype('float32')
df = df[['Kingdom', 'ACC', 'Sequence', 'Membrane', 'labels']]

# Random 80/10/10 split into train/validation/test.
train = df.sample(frac=0.8)
df = df.drop(train.index)
val = df.sample(frac=0.5)
test = df.drop(val.index)
train = train.reset_index(drop=True)
val = val.reset_index(drop=True)
test = test.reset_index(drop=True)

train.to_parquet('deeploc-train.parquet', index=False)
val.to_parquet('deeploc-val.parquet', index=False)
test.to_parquet('deeploc-test.parquet', index=False)
```
## Labels
```python
{'Cell membrane': 0,
 'Cytoplasm': 1,
 'Endoplasmic reticulum': 2,
 'Extracellular': 3,
 'Golgi apparatus': 4,
 'Lysosome/Vacuole': 5,
 'Mitochondrion': 6,
 'Nucleus': 7,
 'Peroxisome': 8,
 'Plastid': 9}
```
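Since `labels` is a ten-element multi-hot vector in the order above, the location names for a row can be recovered with a small helper (a sketch, not part of the original preparation script):

```python
# Location names in the same order as the multi-hot `labels` vector above.
LOCATIONS = [
    "Cell membrane", "Cytoplasm", "Endoplasmic reticulum", "Extracellular",
    "Golgi apparatus", "Lysosome/Vacuole", "Mitochondrion", "Nucleus",
    "Peroxisome", "Plastid",
]

def decode(labels):
    """Return the location names whose multi-hot entry is set."""
    return [name for name, flag in zip(LOCATIONS, labels) if flag == 1.0]

# A protein localized to both the cytoplasm and the nucleus:
print(decode([0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0]))  # ['Cytoplasm', 'Nucleus']
```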
## Citation
**DeepLoc-2.0:**
```
Vineet Thumuluri and others, DeepLoc 2.0: multi-label subcellular localization prediction using protein language models, Nucleic Acids Research, Volume 50, Issue W1, 5 July 2022, Pages W228–W234, https://doi.org/10.1093/nar/gkac278
```
The DeepLoc data is a derivative of the UniProt dataset:
**UniProt**
```
The UniProt Consortium
UniProt: the Universal Protein Knowledgebase in 2023
Nucleic Acids Res. 51:D523–D531 (2023)
```
|
Cognitive-Lab/Indic-MMLU | ---
configs:
- config_name: kn
data_files:
- split: test
path: kn/test.json
- split: validation
path: kn/validation.json
- split: dev
path: kn/dev.json
- config_name: hi
data_files:
- split: test
path: hi/test.json
- split: validation
path: hi/validation.json
- split: dev
path: hi/dev.json
- config_name: ta
data_files:
- split: test
path: ta/test.json
- split: validation
path: ta/validation.json
- split: dev
path: ta/dev.json
- config_name: te
data_files:
- split: test
path: te/test.json
- split: validation
path: te/validation.json
- split: dev
path: te/dev.json
- config_name: ml
data_files:
- split: test
path: ml/test.json
- split: validation
path: ml/validation.json
- split: dev
path: ml/dev.json
- config_name: gu
data_files:
- split: test
path: gu/test.json
- split: validation
path: gu/validation.json
- split: dev
path: gu/dev.json
- config_name: mr
data_files:
- split: test
path: mr/test.json
- split: validation
path: mr/validation.json
- split: dev
path: mr/dev.json
---
# MMLU Translated
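Each language in the YAML above is a separate config. One way to load a split (a sketch assuming the Hugging Face `datasets` library is installed) is:

```python
# Language configs available in this repository (from the YAML above).
AVAILABLE_CONFIGS = ["kn", "hi", "ta", "te", "ml", "gu", "mr"]

def load_split(lang, split="test"):
    """Load one language's split of Indic-MMLU; `lang` must be a listed config."""
    if lang not in AVAILABLE_CONFIGS:
        raise ValueError(f"unknown config: {lang}")
    from datasets import load_dataset  # imported lazily so the check above stays cheap
    return load_dataset("Cognitive-Lab/Indic-MMLU", lang, split=split)

# e.g. hindi_test = load_split("hi")
```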
Citations:
```
@article{hendryckstest2021,
title={Measuring Massive Multitask Language Understanding},
author={Dan Hendrycks and Collin Burns and Steven Basart and Andy Zou and Mantas Mazeika and Dawn Song and Jacob Steinhardt},
journal={Proceedings of the International Conference on Learning Representations (ICLR)},
year={2021}
}
@article{hendrycks2021ethics,
title={Aligning AI With Shared Human Values},
author={Dan Hendrycks and Collin Burns and Steven Basart and Andrew Critch and Jerry Li and Dawn Song and Jacob Steinhardt},
journal={Proceedings of the International Conference on Learning Representations (ICLR)},
year={2021}
}
```
Contributions:\
Thanks to [@Srinidhi9113](https://huggingface.co/Srinidhi9113) for adding the dataset. |
TVRRaviteja/Mental-Health-Data | ---
language:
- en
---
# Mental Health Queries and Personality Dataset
## Overview
This dataset encompasses a collection of mental health queries paired with personality scores and responses generated by a Large Language Model (LLM). It aims to provide insights into the interplay between personality traits and mental health inquiries, facilitating research in personalized conversational agents and mental health support systems.
## Dataset Description
Each record in the dataset contains:
- A query from a Mental Health user.
- A personality score across five types: Agreeableness, Extraversion, Openness, Conscientiousness, and Neuroticism.
- A context interpretation based on the user's personality.
- A tailored response from the Assistant.
## Potential Uses
The dataset is particularly useful for researchers and developers working on:
- Personalized conversational AI in mental health.
- The impact of personality traits on mental health support.
- Enhancing natural language understanding and response generation in the context of mental health.
## Access and Use
This dataset is hosted on Hugging Face Datasets, available for academic and research purposes. Users are encouraged to cite the dataset when used in their research or projects.
---
license: mit
---
|
TalTechNLP/dialogsum_ee | ---
license: cc-by-4.0
dataset_info:
features:
- name: id
dtype: string
- name: dialogue
dtype: string
- name: summary
dtype: string
- name: topic
dtype: string
- name: en_dialogue
dtype: string
- name: en_summary
dtype: string
splits:
- name: train
num_bytes: 22666234
num_examples: 12460
- name: validation
num_bytes: 881912
num_examples: 500
- name: test
num_bytes: 2703111
num_examples: 1500
download_size: 14384437
dataset_size: 26251257
---
|
subAxiom/central-bank-digital-currencies | ---
license: cc
task_categories:
- text-generation
language:
- en
tags:
- finance
pretty_name: Central Bank Digital Currencies
size_categories:
- n<1K
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
chiragtubakad/flan-test-final | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: test
num_bytes: 1196001.0504610357
num_examples: 1000
download_size: 552002
dataset_size: 1196001.0504610357
---
# Dataset Card for "flan-test-final"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zhangshuoming/c_x86_exebench_json_cleaned | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 749238025.3045925
num_examples: 701744
download_size: 209658460
dataset_size: 749238025.3045925
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "c_x86_exebench_json_cleaned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pankajmathur/alpaca_orca | ---
license: cc-by-nc-sa-4.0
task_categories:
- text-generation
language:
- en
size_categories:
- 10K<n<100K
---
An explain-tuned Alpaca dataset (~52K examples) created using approaches from the Orca research paper.
We leverage all 15 system instructions provided in the Orca research paper to generate custom datasets, in contrast to the vanilla instruction-tuning approach used by the original dataset.
This helps student models like [orca_mini_13b](https://huggingface.co/psmathur/orca_mini_13b) learn the thought process of the teacher model, which is ChatGPT (gpt-3.5-turbo-0301).
Please note how the **System** prompt is added before each **instruction**. |
thercyl/NVDA | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: float64
- name: Ticker
dtype: string
- name: Year
dtype: string
- name: Text
dtype: string
- name: Embedding
dtype: string
splits:
- name: train
num_bytes: 68921754
num_examples: 1979
download_size: 40675215
dataset_size: 68921754
---
# Dataset Card for "NVDA"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-computer_security-neg-answer | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_answer
dtype: string
splits:
- name: test
num_bytes: 31239
num_examples: 100
download_size: 22397
dataset_size: 31239
---
# Dataset Card for "mmlu-computer_security-neg-answer"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lowo/ncep-TestData1 | ---
license: mit
---
|
Moatazz/First | ---
task_categories:
- text-classification
language:
- en
pretty_name: Trial
size_categories:
- n<1K
--- |
tner/bc5cdr | ---
language:
- en
license:
- other
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
task_categories:
- token-classification
task_ids:
- named-entity-recognition
pretty_name: BioCreative V CDR
---
# Dataset Card for "tner/bc5cdr"
## Dataset Description
- **Repository:** [T-NER](https://github.com/asahi417/tner)
- **Paper:** [https://academic.oup.com/database/article/doi/10.1093/database/baw032/2630271?login=true](https://academic.oup.com/database/article/doi/10.1093/database/baw032/2630271?login=true)
- **Dataset:** BioCreative V CDR
- **Domain:** Biomedical
- **Number of Entity:** 2
### Dataset Summary
The BioCreative V CDR NER dataset, formatted as part of the [TNER](https://github.com/asahi417/tner) project.
The original dataset consists of long documents that are too long to feed into a language model, so we split them into sentences to reduce their length.
- Entity Types: `Chemical`, `Disease`
## Dataset Structure
### Data Instances
An example of `train` looks as follows.
```
{
'tags': [2, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 0, 0],
'tokens': ['Fasciculations', 'in', 'six', 'areas', 'of', 'the', 'body', 'were', 'scored', 'from', '0', 'to', '3', 'and', 'summated', 'as', 'a', 'total', 'fasciculation', 'score', '.']
}
```
### Label ID
The label2id dictionary can be found at [here](https://huggingface.co/datasets/tner/bc5cdr/raw/main/dataset/label.json).
```python
{
"O": 0,
"B-Chemical": 1,
"B-Disease": 2,
"I-Disease": 3,
"I-Chemical": 4
}
```
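As a minimal sketch of how this mapping is used, the integer `tags` in each instance can be converted back to their BIO labels by inverting the dictionary:

```python
# label2id mapping from above, inverted to decode integer tags.
label2id = {"O": 0, "B-Chemical": 1, "B-Disease": 2, "I-Disease": 3, "I-Chemical": 4}
id2label = {i: label for label, i in label2id.items()}

# First tokens of the example instance shown earlier.
tags = [2, 0, 0, 0]
print([id2label[t] for t in tags])  # ['B-Disease', 'O', 'O', 'O']
```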
### Data Splits
| name |train|validation|test|
|---------|----:|---------:|---:|
|bc5cdr|5228| 5330|5865|
### Citation Information
```
@article{wei2016assessing,
title={Assessing the state of the art in biomedical relation extraction: overview of the BioCreative V chemical-disease relation (CDR) task},
author={Wei, Chih-Hsuan and Peng, Yifan and Leaman, Robert and Davis, Allan Peter and Mattingly, Carolyn J and Li, Jiao and Wiegers, Thomas C and Lu, Zhiyong},
journal={Database},
volume={2016},
year={2016},
publisher={Oxford Academic}
}
``` |
davanstrien/on_the_books | ---
license: cc-by-3.0
language:
- en
tags:
- lam
pretty_name: On the Books
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
andrewrreed/fewnerd-person-names-augmented | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
splits:
- name: train
num_bytes: 42959061.57005247
num_examples: 122254
- name: validation
num_bytes: 4086233.0513204616
num_examples: 20417
- name: test
num_bytes: 8454146.29895592
num_examples: 32293
download_size: 14382598
dataset_size: 55499440.92032885
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
shidowake/oasst2_answers_from_g-ronimo_subset_split_0 | ---
dataset_info:
features:
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 7084599.121609153
num_examples: 2710
download_size: 3550248
dataset_size: 7084599.121609153
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
niizam/fgo-story | ---
license: cc-by-2.0
task_categories:
- translation
language:
- en
- ja
- id
tags:
- story
- conversation
size_categories:
- 1M<n<10M
--- |
lim4349/origin_added_korquad | ---
dataset_info:
features:
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: id
dtype: string
- name: answers
struct:
- name: text
sequence: string
- name: answer_start
sequence: int64
splits:
- name: train
num_bytes: 83769368
num_examples: 57923
- name: validation
num_bytes: 9244735
num_examples: 6436
download_size: 57373216
dataset_size: 93014103
---
# Dataset Card for "origin_added_korquad"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/Russian_Speech_Data_by_Mobile_Phone | ---
YAML tags:
- copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging
---
# Dataset Card for Nexdata/Russian_Speech_Data_by_Mobile_Phone
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/976?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
1,960 native Russian speakers participated in the recording, all with authentic accents. The recorded script was designed by linguists and covers a wide range of topics, including generic, interactive, in-vehicle, and home scenarios. The text was manually proofread with high accuracy. The recordings match mainstream Android and Apple system phones.
For more details, please refer to the link: https://www.nexdata.ai/datasets/976?source=Huggingface
### Supported Tasks and Leaderboards
automatic-speech-recognition, audio-speaker-identification: The dataset can be used to train a model for Automatic Speech Recognition (ASR).
### Languages
Russian
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commerical License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions |
harshgulati/guanaco-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966692
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_quantumaikr__QuantumLM-llama2-70B-Korean-LoRA | ---
pretty_name: Evaluation run of quantumaikr/QuantumLM-llama2-70B-Korean-LoRA
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [quantumaikr/QuantumLM-llama2-70B-Korean-LoRA](https://huggingface.co/quantumaikr/QuantumLM-llama2-70B-Korean-LoRA)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_quantumaikr__QuantumLM-llama2-70B-Korean-LoRA\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-30T07:53:24.183560](https://huggingface.co/datasets/open-llm-leaderboard/details_quantumaikr__QuantumLM-llama2-70B-Korean-LoRA/blob/main/results_2023-08-30T07%3A53%3A24.183560.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6934168799483462,\n\
\ \"acc_stderr\": 0.03115919348812645,\n \"acc_norm\": 0.6971494359890498,\n\
\ \"acc_norm_stderr\": 0.031131669600877022,\n \"mc1\": 0.401468788249694,\n\
\ \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5608488880093394,\n\
\ \"mc2_stderr\": 0.014874770245335572\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6749146757679181,\n \"acc_stderr\": 0.013688147309729119,\n\
\ \"acc_norm\": 0.7056313993174061,\n \"acc_norm_stderr\": 0.013318528460539422\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6743676558454491,\n\
\ \"acc_stderr\": 0.004676529200753001,\n \"acc_norm\": 0.8638717386974706,\n\
\ \"acc_norm_stderr\": 0.0034222387022263645\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8289473684210527,\n \"acc_stderr\": 0.03064360707167709,\n\
\ \"acc_norm\": 0.8289473684210527,\n \"acc_norm_stderr\": 0.03064360707167709\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724057,\n\
\ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724057\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.031164899666948617,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.031164899666948617\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6808510638297872,\n \"acc_stderr\": 0.030472973363380045,\n\
\ \"acc_norm\": 0.6808510638297872,\n \"acc_norm_stderr\": 0.030472973363380045\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43915343915343913,\n \"acc_stderr\": 0.02555992055053101,\n \"\
acc_norm\": 0.43915343915343913,\n \"acc_norm_stderr\": 0.02555992055053101\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8451612903225807,\n\
\ \"acc_stderr\": 0.020579287326583227,\n \"acc_norm\": 0.8451612903225807,\n\
\ \"acc_norm_stderr\": 0.020579287326583227\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.03510766597959217,\n\
\ \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.03510766597959217\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8242424242424242,\n \"acc_stderr\": 0.02972094300622445,\n\
\ \"acc_norm\": 0.8242424242424242,\n \"acc_norm_stderr\": 0.02972094300622445\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"\
acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.01673108529360756,\n\
\ \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.01673108529360756\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7256410256410256,\n \"acc_stderr\": 0.022622765767493225,\n\
\ \"acc_norm\": 0.7256410256410256,\n \"acc_norm_stderr\": 0.022622765767493225\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083015,\n \
\ \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083015\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7857142857142857,\n \"acc_stderr\": 0.026653531596715484,\n\
\ \"acc_norm\": 0.7857142857142857,\n \"acc_norm_stderr\": 0.026653531596715484\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.45695364238410596,\n \"acc_stderr\": 0.04067325174247443,\n \"\
acc_norm\": 0.45695364238410596,\n \"acc_norm_stderr\": 0.04067325174247443\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8844036697247707,\n \"acc_stderr\": 0.013708749534172636,\n \"\
acc_norm\": 0.8844036697247707,\n \"acc_norm_stderr\": 0.013708749534172636\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5879629629629629,\n \"acc_stderr\": 0.03356787758160831,\n \"\
acc_norm\": 0.5879629629629629,\n \"acc_norm_stderr\": 0.03356787758160831\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9068627450980392,\n \"acc_stderr\": 0.020397853969427,\n \"acc_norm\"\
: 0.9068627450980392,\n \"acc_norm_stderr\": 0.020397853969427\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.8734177215189873,\n \"acc_stderr\": 0.02164419572795517,\n \"\
acc_norm\": 0.8734177215189873,\n \"acc_norm_stderr\": 0.02164419572795517\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7757847533632287,\n\
\ \"acc_stderr\": 0.027991534258519517,\n \"acc_norm\": 0.7757847533632287,\n\
\ \"acc_norm_stderr\": 0.027991534258519517\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8320610687022901,\n \"acc_stderr\": 0.032785485373431386,\n\
\ \"acc_norm\": 0.8320610687022901,\n \"acc_norm_stderr\": 0.032785485373431386\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8842975206611571,\n \"acc_stderr\": 0.029199802455622814,\n \"\
acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.029199802455622814\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n\
\ \"acc_stderr\": 0.03434300243631,\n \"acc_norm\": 0.8518518518518519,\n\
\ \"acc_norm_stderr\": 0.03434300243631\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971726,\n\
\ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971726\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5267857142857143,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.5267857142857143,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.0398913985953177,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.0398913985953177\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8531289910600255,\n\
\ \"acc_stderr\": 0.012658201736147278,\n \"acc_norm\": 0.8531289910600255,\n\
\ \"acc_norm_stderr\": 0.012658201736147278\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7687861271676301,\n \"acc_stderr\": 0.022698657167855713,\n\
\ \"acc_norm\": 0.7687861271676301,\n \"acc_norm_stderr\": 0.022698657167855713\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5094972067039106,\n\
\ \"acc_stderr\": 0.01671948464334877,\n \"acc_norm\": 0.5094972067039106,\n\
\ \"acc_norm_stderr\": 0.01671948464334877\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.024170840879340873,\n\
\ \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.024170840879340873\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7588424437299035,\n\
\ \"acc_stderr\": 0.024296594034763426,\n \"acc_norm\": 0.7588424437299035,\n\
\ \"acc_norm_stderr\": 0.024296594034763426\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.022021366100220194,\n\
\ \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.022021366100220194\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5567375886524822,\n \"acc_stderr\": 0.02963483847376601,\n \
\ \"acc_norm\": 0.5567375886524822,\n \"acc_norm_stderr\": 0.02963483847376601\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5560625814863103,\n\
\ \"acc_stderr\": 0.012689708167787677,\n \"acc_norm\": 0.5560625814863103,\n\
\ \"acc_norm_stderr\": 0.012689708167787677\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7536764705882353,\n \"acc_stderr\": 0.02617343857052,\n\
\ \"acc_norm\": 0.7536764705882353,\n \"acc_norm_stderr\": 0.02617343857052\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7630718954248366,\n \"acc_stderr\": 0.01720166216978977,\n \
\ \"acc_norm\": 0.7630718954248366,\n \"acc_norm_stderr\": 0.01720166216978977\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7755102040816326,\n \"acc_stderr\": 0.0267114305555384,\n\
\ \"acc_norm\": 0.7755102040816326,\n \"acc_norm_stderr\": 0.0267114305555384\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n\
\ \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n\
\ \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.93,\n \"acc_stderr\": 0.0256432399976243,\n \
\ \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.0256432399976243\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.02709729011807082,\n\
\ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.02709729011807082\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.401468788249694,\n\
\ \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5608488880093394,\n\
\ \"mc2_stderr\": 0.014874770245335572\n }\n}\n```"
repo_url: https://huggingface.co/quantumaikr/QuantumLM-llama2-70B-Korean-LoRA
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|arc:challenge|25_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hellaswag|10_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T07:53:24.183560.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-30T07:53:24.183560.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T07:53:24.183560.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-30T07:53:24.183560.parquet'
- config_name: results
data_files:
- split: 2023_08_30T07_53_24.183560
path:
- results_2023-08-30T07:53:24.183560.parquet
- split: latest
path:
- results_2023-08-30T07:53:24.183560.parquet
---
# Dataset Card for Evaluation run of quantumaikr/QuantumLM-llama2-70B-Korean-LoRA
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/quantumaikr/QuantumLM-llama2-70B-Korean-LoRA
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [quantumaikr/QuantumLM-llama2-70B-Korean-LoRA](https://huggingface.co/quantumaikr/QuantumLM-llama2-70B-Korean-LoRA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_quantumaikr__QuantumLM-llama2-70B-Korean-LoRA",
"harness_truthfulqa_mc_0",
split="train")
```
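The timestamped split names encode the run time. A small helper (an illustrative sketch, not part of the dataset tooling) can parse and sort them locally, without touching the Hub:

```python
from datetime import datetime

def parse_split_timestamp(split_name: str) -> datetime:
    """Parse a timestamped split name such as '2023_08_30T07_53_24.183560'."""
    return datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")

def most_recent_split(split_names) -> str:
    """Return the newest timestamped split, ignoring the 'latest' alias."""
    timestamped = [s for s in split_names if s != "latest"]
    return max(timestamped, key=parse_split_timestamp)
```

For example, `most_recent_split(["2023_08_30T07_53_24.183560", "latest"])` returns the timestamped name, which always refers to the same data as the `latest` alias.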
## Latest results
These are the [latest results from run 2023-08-30T07:53:24.183560](https://huggingface.co/datasets/open-llm-leaderboard/details_quantumaikr__QuantumLM-llama2-70B-Korean-LoRA/blob/main/results_2023-08-30T07%3A53%3A24.183560.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6934168799483462,
"acc_stderr": 0.03115919348812645,
"acc_norm": 0.6971494359890498,
"acc_norm_stderr": 0.031131669600877022,
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693654,
"mc2": 0.5608488880093394,
"mc2_stderr": 0.014874770245335572
},
"harness|arc:challenge|25": {
"acc": 0.6749146757679181,
"acc_stderr": 0.013688147309729119,
"acc_norm": 0.7056313993174061,
"acc_norm_stderr": 0.013318528460539422
},
"harness|hellaswag|10": {
"acc": 0.6743676558454491,
"acc_stderr": 0.004676529200753001,
"acc_norm": 0.8638717386974706,
"acc_norm_stderr": 0.0034222387022263645
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8289473684210527,
"acc_stderr": 0.03064360707167709,
"acc_norm": 0.8289473684210527,
"acc_norm_stderr": 0.03064360707167709
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724057,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.031164899666948617,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.031164899666948617
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6808510638297872,
"acc_stderr": 0.030472973363380045,
"acc_norm": 0.6808510638297872,
"acc_norm_stderr": 0.030472973363380045
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43915343915343913,
"acc_stderr": 0.02555992055053101,
"acc_norm": 0.43915343915343913,
"acc_norm_stderr": 0.02555992055053101
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8451612903225807,
"acc_stderr": 0.020579287326583227,
"acc_norm": 0.8451612903225807,
"acc_norm_stderr": 0.020579287326583227
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.03510766597959217,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.03510766597959217
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8242424242424242,
"acc_stderr": 0.02972094300622445,
"acc_norm": 0.8242424242424242,
"acc_norm_stderr": 0.02972094300622445
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.01673108529360756,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.01673108529360756
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7256410256410256,
"acc_stderr": 0.022622765767493225,
"acc_norm": 0.7256410256410256,
"acc_norm_stderr": 0.022622765767493225
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083015,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083015
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7857142857142857,
"acc_stderr": 0.026653531596715484,
"acc_norm": 0.7857142857142857,
"acc_norm_stderr": 0.026653531596715484
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.45695364238410596,
"acc_stderr": 0.04067325174247443,
"acc_norm": 0.45695364238410596,
"acc_norm_stderr": 0.04067325174247443
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8844036697247707,
"acc_stderr": 0.013708749534172636,
"acc_norm": 0.8844036697247707,
"acc_norm_stderr": 0.013708749534172636
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5879629629629629,
"acc_stderr": 0.03356787758160831,
"acc_norm": 0.5879629629629629,
"acc_norm_stderr": 0.03356787758160831
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9068627450980392,
"acc_stderr": 0.020397853969427,
"acc_norm": 0.9068627450980392,
"acc_norm_stderr": 0.020397853969427
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8734177215189873,
"acc_stderr": 0.02164419572795517,
"acc_norm": 0.8734177215189873,
"acc_norm_stderr": 0.02164419572795517
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7757847533632287,
"acc_stderr": 0.027991534258519517,
"acc_norm": 0.7757847533632287,
"acc_norm_stderr": 0.027991534258519517
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8320610687022901,
"acc_stderr": 0.032785485373431386,
"acc_norm": 0.8320610687022901,
"acc_norm_stderr": 0.032785485373431386
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.029199802455622814,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.029199802455622814
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.03434300243631,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.03434300243631
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.030446777687971726,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.030446777687971726
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5267857142857143,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.5267857142857143,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.0398913985953177,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.0398913985953177
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8531289910600255,
"acc_stderr": 0.012658201736147278,
"acc_norm": 0.8531289910600255,
"acc_norm_stderr": 0.012658201736147278
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7687861271676301,
"acc_stderr": 0.022698657167855713,
"acc_norm": 0.7687861271676301,
"acc_norm_stderr": 0.022698657167855713
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5094972067039106,
"acc_stderr": 0.01671948464334877,
"acc_norm": 0.5094972067039106,
"acc_norm_stderr": 0.01671948464334877
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.024170840879340873,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.024170840879340873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7588424437299035,
"acc_stderr": 0.024296594034763426,
"acc_norm": 0.7588424437299035,
"acc_norm_stderr": 0.024296594034763426
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.022021366100220194,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.022021366100220194
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5567375886524822,
"acc_stderr": 0.02963483847376601,
"acc_norm": 0.5567375886524822,
"acc_norm_stderr": 0.02963483847376601
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5560625814863103,
"acc_stderr": 0.012689708167787677,
"acc_norm": 0.5560625814863103,
"acc_norm_stderr": 0.012689708167787677
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7536764705882353,
"acc_stderr": 0.02617343857052,
"acc_norm": 0.7536764705882353,
"acc_norm_stderr": 0.02617343857052
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7630718954248366,
"acc_stderr": 0.01720166216978977,
"acc_norm": 0.7630718954248366,
"acc_norm_stderr": 0.01720166216978977
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.04309118709946458,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.04309118709946458
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7755102040816326,
"acc_stderr": 0.0267114305555384,
"acc_norm": 0.7755102040816326,
"acc_norm_stderr": 0.0267114305555384
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700643,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.0256432399976243,
"acc_norm": 0.93,
"acc_norm_stderr": 0.0256432399976243
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.02709729011807082,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.02709729011807082
},
"harness|truthfulqa:mc|0": {
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693654,
"mc2": 0.5608488880093394,
"mc2_stderr": 0.014874770245335572
}
}
```
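The per-task entries above can be macro-averaged offline. The sketch below shows one way to do it; this is not necessarily the leaderboard's exact aggregation, and the `excerpt` dictionary is a hypothetical two-task subset of the results for illustration:

```python
def macro_average_acc(results: dict) -> float:
    """Macro-average 'acc' over per-task entries, skipping the 'all' summary
    and entries (like truthfulqa) that report mc1/mc2 instead of acc."""
    accs = [v["acc"] for k, v in results.items() if k != "all" and "acc" in v]
    return sum(accs) / len(accs)

# Hypothetical two-task excerpt of a results dictionary like the one above.
excerpt = {
    "all": {"acc": 0.5},
    "harness|arc:challenge|25": {"acc": 0.6},
    "harness|hellaswag|10": {"acc": 0.4},
    "harness|truthfulqa:mc|0": {"mc1": 0.4},  # no 'acc' key, skipped
}
print(macro_average_acc(excerpt))  # 0.5
```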
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
bigscience-data/roots_zh_ted_talks_iwslt | ---
language: zh
license: cc-by-nc-nd-4.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_zh_ted_talks_iwslt
# WIT Ted Talks
- Dataset uid: `ted_talks_iwslt`
### Description
The Web Inventory of Transcribed and Translated Talks (WIT3) is a collection of the original TED talks and their translated versions. Translations are available in 109+ languages, though the distribution is not uniform.
### Homepage
https://github.com/huggingface/datasets/blob/master/datasets/ted_talks_iwslt/README.md
### Licensing
- open license
- cc-by-nc-nd-4.0: Creative Commons Attribution NonCommercial NoDerivatives 4.0 International
TED makes its collection of video recordings and transcripts of talks available under the Creative Commons BY-NC-ND license. WIT3 acknowledges the authorship of TED talks (BY condition) and does not redistribute transcripts for commercial purposes (NC). As regards the integrity of the work (ND), WIT3 only changes the format of the container while preserving the original contents. WIT3 aims to support research on human language processing as well as the diffusion of TED Talks!
### Speaker Locations
- Southern Europe
- Italy
### Sizes
- 0.0305 % of total
- 0.0736 % of ar
- 0.2002 % of pt
- 0.0128 % of zh
- 0.2236 % of vi
- 0.0330 % of fr
- 0.0545 % of es
- 0.0122 % of en
- 0.3704 % of id
- 0.0373 % of indic-hi
- 0.0330 % of indic-ta
- 0.1393 % of indic-mr
- 0.0305 % of ca
- 0.1179 % of indic-ur
- 0.0147 % of indic-bn
- 0.0240 % of indic-ml
- 0.0244 % of indic-te
- 0.0503 % of indic-gu
- 0.0211 % of indic-kn
- 0.0274 % of eu
- 0.0023 % of indic-as
- 0.0001 % of indic-pa
### BigScience processing steps
#### Filters applied to: ar
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: pt
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: zh
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: vi
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: fr
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: es
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: en
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: id
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-hi
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ta
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-mr
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: ca
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: indic-ur
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-bn
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ml
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-te
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-gu
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-kn
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: eu
- dedup_document
- filter_remove_empty_docs
#### Filters applied to: indic-as
- dedup_document
- filter_remove_empty_docs
#### Filters applied to: indic-pa
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
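The filters listed above are named after what they do; minimal sketches of plausible implementations are shown below (the real ROOTS/BigScience preprocessing code may differ in details such as normalization before deduplication):

```python
def dedup_document(docs):
    """Drop exact duplicate documents, keeping the first occurrence."""
    seen, out = set(), []
    for d in docs:
        if d not in seen:
            seen.add(d)
            out.append(d)
    return out

def filter_remove_empty_docs(docs):
    """Drop documents that are empty or whitespace-only."""
    return [d for d in docs if d.strip()]

def filter_small_docs_bytes(docs, min_bytes=300):
    """Keep only documents whose UTF-8 encoding is at least `min_bytes` bytes
    (e.g. 300 for filter_small_docs_bytes_300, 1024 for ..._1024)."""
    return [d for d in docs if len(d.encode("utf-8")) >= min_bytes]
```

Chaining them in the order listed per language reproduces the pipeline shape: dedup first, then empty-document removal, then the byte-size cutoff.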
|
japanese-asr/whisper_transcriptions.reazonspeech.all_12 | ---
dataset_info:
config_name: all
features:
- name: name
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: whisper_transcript
sequence: int64
splits:
- name: train
num_bytes: 30338653855.0
num_examples: 266555
download_size: 30101931897
dataset_size: 30338653855.0
configs:
- config_name: all
data_files:
- split: train
path: all/train-*
---
|
open-llm-leaderboard/details_NoIdeaLand__test-4k-fn | ---
pretty_name: Evaluation run of NoIdeaLand/test-4k-fn
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NoIdeaLand/test-4k-fn](https://huggingface.co/NoIdeaLand/test-4k-fn) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NoIdeaLand__test-4k-fn\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-01T16:31:47.992543](https://huggingface.co/datasets/open-llm-leaderboard/details_NoIdeaLand__test-4k-fn/blob/main/results_2023-10-01T16-31-47.992543.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2795859427889157,\n\
\ \"acc_stderr\": 0.03244654146727709,\n \"acc_norm\": 0.283431508310814,\n\
\ \"acc_norm_stderr\": 0.032446107426975616,\n \"mc1\": 0.23255813953488372,\n\
\ \"mc1_stderr\": 0.014789157531080512,\n \"mc2\": 0.38860179255046867,\n\
\ \"mc2_stderr\": 0.014093255696402213\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.35665529010238906,\n \"acc_stderr\": 0.01399805690262019,\n\
\ \"acc_norm\": 0.3993174061433447,\n \"acc_norm_stderr\": 0.014312094557946704\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4971121290579566,\n\
\ \"acc_stderr\": 0.004989698183207823,\n \"acc_norm\": 0.6813383788090022,\n\
\ \"acc_norm_stderr\": 0.004650052150094427\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816508,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816508\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n\
\ \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n\
\ \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.34,\n\
\ \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.29056603773584905,\n \"acc_stderr\": 0.027943219989337145,\n\
\ \"acc_norm\": 0.29056603773584905,\n \"acc_norm_stderr\": 0.027943219989337145\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n\
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2543352601156069,\n\
\ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.2543352601156069,\n\
\ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617746,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617746\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3021276595744681,\n \"acc_stderr\": 0.030017554471880557,\n\
\ \"acc_norm\": 0.3021276595744681,\n \"acc_norm_stderr\": 0.030017554471880557\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n\
\ \"acc_stderr\": 0.03892431106518754,\n \"acc_norm\": 0.21929824561403508,\n\
\ \"acc_norm_stderr\": 0.03892431106518754\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.038552896163789485,\n\
\ \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.038552896163789485\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24603174603174602,\n \"acc_stderr\": 0.022182037202948365,\n \"\
acc_norm\": 0.24603174603174602,\n \"acc_norm_stderr\": 0.022182037202948365\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03670066451047182,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03670066451047182\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24838709677419354,\n\
\ \"acc_stderr\": 0.024580028921481003,\n \"acc_norm\": 0.24838709677419354,\n\
\ \"acc_norm_stderr\": 0.024580028921481003\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2019704433497537,\n \"acc_stderr\": 0.02824735012218027,\n\
\ \"acc_norm\": 0.2019704433497537,\n \"acc_norm_stderr\": 0.02824735012218027\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\"\
: 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.23232323232323232,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.23232323232323232,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.3471502590673575,\n \"acc_stderr\": 0.03435696168361355,\n\
\ \"acc_norm\": 0.3471502590673575,\n \"acc_norm_stderr\": 0.03435696168361355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.31025641025641026,\n \"acc_stderr\": 0.023454674889404288,\n\
\ \"acc_norm\": 0.31025641025641026,\n \"acc_norm_stderr\": 0.023454674889404288\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.25630252100840334,\n \"acc_stderr\": 0.02835962087053395,\n\
\ \"acc_norm\": 0.25630252100840334,\n \"acc_norm_stderr\": 0.02835962087053395\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.036313298039696525,\n \"\
acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.036313298039696525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24036697247706423,\n \"acc_stderr\": 0.01832060732096407,\n \"\
acc_norm\": 0.24036697247706423,\n \"acc_norm_stderr\": 0.01832060732096407\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.26851851851851855,\n \"acc_stderr\": 0.030225226160012397,\n \"\
acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.030225226160012397\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24509803921568626,\n \"acc_stderr\": 0.03019028245350194,\n \"\
acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.03019028245350194\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3037974683544304,\n \"acc_stderr\": 0.029936696387138605,\n \
\ \"acc_norm\": 0.3037974683544304,\n \"acc_norm_stderr\": 0.029936696387138605\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3542600896860987,\n\
\ \"acc_stderr\": 0.03210062154134987,\n \"acc_norm\": 0.3542600896860987,\n\
\ \"acc_norm_stderr\": 0.03210062154134987\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2900763358778626,\n \"acc_stderr\": 0.03980066246467765,\n\
\ \"acc_norm\": 0.2900763358778626,\n \"acc_norm_stderr\": 0.03980066246467765\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.35537190082644626,\n \"acc_stderr\": 0.04369236326573981,\n \"\
acc_norm\": 0.35537190082644626,\n \"acc_norm_stderr\": 0.04369236326573981\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n\
\ \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n\
\ \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.27607361963190186,\n \"acc_stderr\": 0.0351238528370505,\n\
\ \"acc_norm\": 0.27607361963190186,\n \"acc_norm_stderr\": 0.0351238528370505\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.22330097087378642,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.22330097087378642,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3803418803418803,\n\
\ \"acc_stderr\": 0.03180425204384099,\n \"acc_norm\": 0.3803418803418803,\n\
\ \"acc_norm_stderr\": 0.03180425204384099\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24521072796934865,\n\
\ \"acc_stderr\": 0.015384352284543936,\n \"acc_norm\": 0.24521072796934865,\n\
\ \"acc_norm_stderr\": 0.015384352284543936\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.02344582627654555,\n\
\ \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.02344582627654555\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.02678745311190654,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.02678745311190654\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2508038585209003,\n\
\ \"acc_stderr\": 0.024619771956697165,\n \"acc_norm\": 0.2508038585209003,\n\
\ \"acc_norm_stderr\": 0.024619771956697165\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.22530864197530864,\n \"acc_stderr\": 0.02324620264781975,\n\
\ \"acc_norm\": 0.22530864197530864,\n \"acc_norm_stderr\": 0.02324620264781975\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2553191489361702,\n \"acc_stderr\": 0.026011992930902013,\n \
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.026011992930902013\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2777053455019557,\n\
\ \"acc_stderr\": 0.011438741422769575,\n \"acc_norm\": 0.2777053455019557,\n\
\ \"acc_norm_stderr\": 0.011438741422769575\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.1948529411764706,\n \"acc_stderr\": 0.024060599423487428,\n\
\ \"acc_norm\": 0.1948529411764706,\n \"acc_norm_stderr\": 0.024060599423487428\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.28104575163398693,\n \"acc_stderr\": 0.018185218954318075,\n \
\ \"acc_norm\": 0.28104575163398693,\n \"acc_norm_stderr\": 0.018185218954318075\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3181818181818182,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.3181818181818182,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2530612244897959,\n \"acc_stderr\": 0.027833023871399677,\n\
\ \"acc_norm\": 0.2530612244897959,\n \"acc_norm_stderr\": 0.027833023871399677\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.27860696517412936,\n\
\ \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.27860696517412936,\n\
\ \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2891566265060241,\n\
\ \"acc_stderr\": 0.03529486801511114,\n \"acc_norm\": 0.2891566265060241,\n\
\ \"acc_norm_stderr\": 0.03529486801511114\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23255813953488372,\n\
\ \"mc1_stderr\": 0.014789157531080512,\n \"mc2\": 0.38860179255046867,\n\
\ \"mc2_stderr\": 0.014093255696402213\n }\n}\n```"
repo_url: https://huggingface.co/NoIdeaLand/test-4k-fn
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|arc:challenge|25_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hellaswag|10_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T16-31-47.992543.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-01T16-31-47.992543.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T16-31-47.992543.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-01T16-31-47.992543.parquet'
- config_name: results
data_files:
- split: 2023_10_01T16_31_47.992543
path:
- results_2023-10-01T16-31-47.992543.parquet
- split: latest
path:
- results_2023-10-01T16-31-47.992543.parquet
---
# Dataset Card for Evaluation run of NoIdeaLand/test-4k-fn
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NoIdeaLand/test-4k-fn
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [NoIdeaLand/test-4k-fn](https://huggingface.co/NoIdeaLand/test-4k-fn) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NoIdeaLand__test-4k-fn",
"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-01T16:31:47.992543](https://huggingface.co/datasets/open-llm-leaderboard/details_NoIdeaLand__test-4k-fn/blob/main/results_2023-10-01T16-31-47.992543.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the timestamped splits and the "latest" split of each configuration):
```python
{
"all": {
"acc": 0.2795859427889157,
"acc_stderr": 0.03244654146727709,
"acc_norm": 0.283431508310814,
"acc_norm_stderr": 0.032446107426975616,
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080512,
"mc2": 0.38860179255046867,
"mc2_stderr": 0.014093255696402213
},
"harness|arc:challenge|25": {
"acc": 0.35665529010238906,
"acc_stderr": 0.01399805690262019,
"acc_norm": 0.3993174061433447,
"acc_norm_stderr": 0.014312094557946704
},
"harness|hellaswag|10": {
"acc": 0.4971121290579566,
"acc_stderr": 0.004989698183207823,
"acc_norm": 0.6813383788090022,
"acc_norm_stderr": 0.004650052150094427
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.03673731683969506,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.03673731683969506
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.29056603773584905,
"acc_stderr": 0.027943219989337145,
"acc_norm": 0.29056603773584905,
"acc_norm_stderr": 0.027943219989337145
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617746,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617746
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3021276595744681,
"acc_stderr": 0.030017554471880557,
"acc_norm": 0.3021276595744681,
"acc_norm_stderr": 0.030017554471880557
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.03892431106518754,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.03892431106518754
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.038552896163789485,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.038552896163789485
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.022182037202948365,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.022182037202948365
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047182,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047182
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24838709677419354,
"acc_stderr": 0.024580028921481003,
"acc_norm": 0.24838709677419354,
"acc_norm_stderr": 0.024580028921481003
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2019704433497537,
"acc_stderr": 0.02824735012218027,
"acc_norm": 0.2019704433497537,
"acc_norm_stderr": 0.02824735012218027
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.23232323232323232,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.23232323232323232,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3471502590673575,
"acc_stderr": 0.03435696168361355,
"acc_norm": 0.3471502590673575,
"acc_norm_stderr": 0.03435696168361355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.31025641025641026,
"acc_stderr": 0.023454674889404288,
"acc_norm": 0.31025641025641026,
"acc_norm_stderr": 0.023454674889404288
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.25630252100840334,
"acc_stderr": 0.02835962087053395,
"acc_norm": 0.25630252100840334,
"acc_norm_stderr": 0.02835962087053395
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.036313298039696525,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.036313298039696525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24036697247706423,
"acc_stderr": 0.01832060732096407,
"acc_norm": 0.24036697247706423,
"acc_norm_stderr": 0.01832060732096407
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.030225226160012397,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.030225226160012397
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.03019028245350194,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.03019028245350194
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3037974683544304,
"acc_stderr": 0.029936696387138605,
"acc_norm": 0.3037974683544304,
"acc_norm_stderr": 0.029936696387138605
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3542600896860987,
"acc_stderr": 0.03210062154134987,
"acc_norm": 0.3542600896860987,
"acc_norm_stderr": 0.03210062154134987
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2900763358778626,
"acc_stderr": 0.03980066246467765,
"acc_norm": 0.2900763358778626,
"acc_norm_stderr": 0.03980066246467765
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.35537190082644626,
"acc_stderr": 0.04369236326573981,
"acc_norm": 0.35537190082644626,
"acc_norm_stderr": 0.04369236326573981
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052192,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.27607361963190186,
"acc_stderr": 0.0351238528370505,
"acc_norm": 0.27607361963190186,
"acc_norm_stderr": 0.0351238528370505
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.22330097087378642,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.22330097087378642,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3803418803418803,
"acc_stderr": 0.03180425204384099,
"acc_norm": 0.3803418803418803,
"acc_norm_stderr": 0.03180425204384099
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24521072796934865,
"acc_stderr": 0.015384352284543936,
"acc_norm": 0.24521072796934865,
"acc_norm_stderr": 0.015384352284543936
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.02344582627654555,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.02344582627654555
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.02678745311190654,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.02678745311190654
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2508038585209003,
"acc_stderr": 0.024619771956697165,
"acc_norm": 0.2508038585209003,
"acc_norm_stderr": 0.024619771956697165
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22530864197530864,
"acc_stderr": 0.02324620264781975,
"acc_norm": 0.22530864197530864,
"acc_norm_stderr": 0.02324620264781975
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.026011992930902013,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.026011992930902013
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2777053455019557,
"acc_stderr": 0.011438741422769575,
"acc_norm": 0.2777053455019557,
"acc_norm_stderr": 0.011438741422769575
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.1948529411764706,
"acc_stderr": 0.024060599423487428,
"acc_norm": 0.1948529411764706,
"acc_norm_stderr": 0.024060599423487428
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.28104575163398693,
"acc_stderr": 0.018185218954318075,
"acc_norm": 0.28104575163398693,
"acc_norm_stderr": 0.018185218954318075
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3181818181818182,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.3181818181818182,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2530612244897959,
"acc_stderr": 0.027833023871399677,
"acc_norm": 0.2530612244897959,
"acc_norm_stderr": 0.027833023871399677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.27860696517412936,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.27860696517412936,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2891566265060241,
"acc_stderr": 0.03529486801511114,
"acc_norm": 0.2891566265060241,
"acc_norm_stderr": 0.03529486801511114
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080512,
"mc2": 0.38860179255046867,
"mc2_stderr": 0.014093255696402213
}
}
```
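The top-level `all` block above appears to be a plain average over the per-task scores; a minimal, self-contained sketch of that kind of aggregation, using two values copied verbatim from the JSON above (everything else in the snippet is illustrative):

```python
import json

# Excerpt of the per-task results shown above (values copied verbatim)
excerpt = json.loads("""
{
  "harness|arc:challenge|25": {"acc_norm": 0.3993174061433447},
  "harness|hellaswag|10": {"acc_norm": 0.6813383788090022}
}
""")

# Mean normalized accuracy across the listed tasks
accs = [task["acc_norm"] for task in excerpt.values()]
avg_acc_norm = sum(accs) / len(accs)
print(f"acc_norm over {len(accs)} tasks: {avg_acc_norm:.4f}")
```

With the full results file (downloadable from the link above), the same loop can run over every `harness|...` key.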
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
umarigan/turkish_corpus | ---
license: mit
task_categories:
- feature-extraction
language:
- tr
pretty_name: Corpus
size_categories:
- 10M<n<100M
--- |
huggingface-course/documentation-images | ---
license: apache-2.0
---
|
RoryCochrane/pokemon-and-fakemon | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 480609633.745
num_examples: 4763
download_size: 391516344
dataset_size: 480609633.745
---
# Dataset Card for "pokemon-and-fakemon"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/hassan_of_the_serenity_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of hassan_of_the_serenity/静謐のハサン/静谧哈桑 (Fate/Grand Order)
This is the dataset of hassan_of_the_serenity/静謐のハサン/静谧哈桑 (Fate/Grand Order), containing 500 images and their tags.
The core tags of this character are `purple_hair, dark_skin, dark-skinned_female, purple_eyes, short_hair, breasts, hair_between_eyes, hairband, medium_breasts, black_hairband, very_dark_skin`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 578.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hassan_of_the_serenity_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 497.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hassan_of_the_serenity_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1186 | 977.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hassan_of_the_serenity_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hassan_of_the_serenity_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 23 |  |  |  |  |  | 1girl, solo, bare_shoulders, black_gloves, looking_at_viewer, center_opening, black_leotard, navel, white_background, simple_background, fingerless_gloves, holding, weapon, parted_lips |
| 1 | 6 |  |  |  |  |  | 1girl, bare_shoulders, black_gloves, center_opening, fingerless_gloves, navel, solo, black_leotard, blush, cleavage, looking_at_viewer, open_mouth |
| 2 | 14 |  |  |  |  |  | 1girl, ass, backless_outfit, bare_back, bare_shoulders, black_gloves, fingerless_gloves, from_behind, solo, looking_at_viewer, looking_back, holding_weapon, leotard, sideboob, kunai, between_fingers, butt_crack, knife, simple_background, white_background, leggings, night, sky, toeless_legwear |
| 3 | 15 |  |  |  |  |  | 1girl, bare_shoulders, sleeveless_dress, solo, white_dress, collarbone, white_background, looking_at_viewer, blush, sidelocks, simple_background, bare_arms, upper_body, closed_mouth, parted_lips, sundress |
| 4 | 7 |  |  |  |  |  | 1girl, black_shirt, blush, closed_mouth, solo, white_background, looking_at_viewer, collarbone, long_sleeves, sidelocks, simple_background, sleeves_past_wrists, smile, hand_up, upper_body |
| 5 | 12 |  |  |  |  |  | 1girl, blush, looking_at_viewer, puffy_long_sleeves, solo, hood_down, sleeves_past_wrists, smile, white_background, drawstring, simple_background, black_hoodie, closed_mouth, :>, v-shaped_eyebrows, hand_up |
| 6 | 6 |  |  |  |  |  | 1girl, collared_shirt, long_sleeves, looking_at_viewer, pleated_skirt, school_uniform, solo, white_background, white_shirt, blush, plaid_skirt, sleeves_past_wrists, smile, alternate_costume, blazer, closed_mouth, open_jacket, sidelocks, simple_background, black_jacket, black_skirt, bowtie, sweater |
| 7 | 42 |  |  |  |  |  | 1girl, bare_shoulders, solo, detached_sleeves, official_alternate_costume, looking_at_viewer, detached_collar, hair_flower, long_sleeves, white_dress, ribbon, strapless_dress, blush, bow, red_apple, smile, ahoge, holding_fruit, closed_mouth, pink_dress |
| 8 | 10 |  |  |  |  |  | 1girl, bare_shoulders, bell, looking_at_viewer, solo, blush, christmas, smile, white_thighhighs, white_panties, navel, ribbon-trimmed_legwear, red_bow, sheep_horns, underboob, closed_mouth, gift_box, sitting, stomach, bare_arms, colored_skin, fireplace, fur_collar, indoors, sidelocks |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | bare_shoulders | black_gloves | looking_at_viewer | center_opening | black_leotard | navel | white_background | simple_background | fingerless_gloves | holding | weapon | parted_lips | blush | cleavage | open_mouth | ass | backless_outfit | bare_back | from_behind | looking_back | holding_weapon | leotard | sideboob | kunai | between_fingers | butt_crack | knife | leggings | night | sky | toeless_legwear | sleeveless_dress | white_dress | collarbone | sidelocks | bare_arms | upper_body | closed_mouth | sundress | black_shirt | long_sleeves | sleeves_past_wrists | smile | hand_up | puffy_long_sleeves | hood_down | drawstring | black_hoodie | :> | v-shaped_eyebrows | collared_shirt | pleated_skirt | school_uniform | white_shirt | plaid_skirt | alternate_costume | blazer | open_jacket | black_jacket | black_skirt | bowtie | sweater | detached_sleeves | official_alternate_costume | detached_collar | hair_flower | ribbon | strapless_dress | bow | red_apple | ahoge | holding_fruit | pink_dress | bell | christmas | white_thighhighs | white_panties | ribbon-trimmed_legwear | red_bow | sheep_horns | underboob | gift_box | sitting | stomach | colored_skin | fireplace | fur_collar | indoors |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------------|:---------------|:--------------------|:-----------------|:----------------|:--------|:-------------------|:--------------------|:--------------------|:----------|:---------|:--------------|:--------|:-----------|:-------------|:------|:------------------|:------------|:--------------|:---------------|:-----------------|:----------|:-----------|:--------|:------------------|:-------------|:--------|:-----------|:--------|:------|:------------------|:-------------------|:--------------|:-------------|:------------|:------------|:-------------|:---------------|:-----------|:--------------|:---------------|:----------------------|:--------|:----------|:---------------------|:------------|:-------------|:---------------|:-----|:--------------------|:-----------------|:----------------|:-----------------|:--------------|:--------------|:--------------------|:---------|:--------------|:---------------|:--------------|:---------|:----------|:-------------------|:-----------------------------|:------------------|:--------------|:---------|:------------------|:------|:------------|:--------|:----------------|:-------------|:-------|:------------|:-------------------|:----------------|:-------------------------|:----------|:--------------|:------------|:-----------|:----------|:----------|:---------------|:------------|:-------------|:----------|
| 0 | 23 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | X | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 14 |  |  |  |  |  | X | X | X | X | X | | | | X | X | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 15 |  |  |  |  |  | X | X | X | | X | | | | X | X | | | | X | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | X | | | X | | | | X | X | | | | | X | | | | | | | | | | | | | | | | | | | | | X | X | | X | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 12 |  |  |  |  |  | X | X | | | X | | | | X | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | X | | | X | | | | X | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | | | X | | | X | X | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 42 |  |  |  |  |  | X | X | X | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | | | | | X | | | X | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 8 | 10 |  |  |  |  |  | X | X | X | | X | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | X | X | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
cgulse/alpaca-cleaned-tr | ---
license: cc-by-4.0
language:
- tr
tags:
- alpaca
- instruction-finetuning
pretty_name: Turkish Alpaca-cleaned
size_categories:
- 10K<n<100K
---
Alpaca-cleaned dataset, machine translated with facebook/nllb-200-3.3B.
Languages: Turkish |
open-llm-leaderboard/details_g-ronimo__phi-2-OpenHermes-2.5-v2 | ---
pretty_name: Evaluation run of g-ronimo/phi-2-OpenHermes-2.5-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [g-ronimo/phi-2-OpenHermes-2.5-v2](https://huggingface.co/g-ronimo/phi-2-OpenHermes-2.5-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_g-ronimo__phi-2-OpenHermes-2.5-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T00:49:09.888984](https://huggingface.co/datasets/open-llm-leaderboard/details_g-ronimo__phi-2-OpenHermes-2.5-v2/blob/main/results_2024-03-10T00-49-09.888984.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.564734458241999,\n\
\ \"acc_stderr\": 0.03391431521091429,\n \"acc_norm\": 0.5676857564160381,\n\
\ \"acc_norm_stderr\": 0.03461774832252384,\n \"mc1\": 0.3011015911872705,\n\
\ \"mc1_stderr\": 0.01605899902610061,\n \"mc2\": 0.44887128521126124,\n\
\ \"mc2_stderr\": 0.015342799330160783\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5699658703071673,\n \"acc_stderr\": 0.01446763155913799,\n\
\ \"acc_norm\": 0.5844709897610921,\n \"acc_norm_stderr\": 0.014401366641216388\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5592511451902011,\n\
\ \"acc_stderr\": 0.004954622308738996,\n \"acc_norm\": 0.7456681935869349,\n\
\ \"acc_norm_stderr\": 0.004345949382382379\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.0403356566784832,\n\
\ \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.0403356566784832\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5735849056603773,\n \"acc_stderr\": 0.03043779434298305,\n\
\ \"acc_norm\": 0.5735849056603773,\n \"acc_norm_stderr\": 0.03043779434298305\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n\
\ \"acc_stderr\": 0.03794012674697031,\n \"acc_norm\": 0.5491329479768786,\n\
\ \"acc_norm_stderr\": 0.03794012674697031\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006717,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006717\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404907,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404907\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6645161290322581,\n \"acc_stderr\": 0.026860206444724352,\n \"\
acc_norm\": 0.6645161290322581,\n \"acc_norm_stderr\": 0.026860206444724352\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n \"\
acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.036974422050315967,\n\
\ \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.036974422050315967\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7373737373737373,\n \"acc_stderr\": 0.03135305009533086,\n \"\
acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.03135305009533086\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7564766839378239,\n \"acc_stderr\": 0.030975436386845436,\n\
\ \"acc_norm\": 0.7564766839378239,\n \"acc_norm_stderr\": 0.030975436386845436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5615384615384615,\n \"acc_stderr\": 0.025158266016868578,\n\
\ \"acc_norm\": 0.5615384615384615,\n \"acc_norm_stderr\": 0.025158266016868578\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871916,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871916\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.03196876989195778,\n \
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.03196876989195778\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7834862385321101,\n \"acc_stderr\": 0.017658710594443128,\n \"\
acc_norm\": 0.7834862385321101,\n \"acc_norm_stderr\": 0.017658710594443128\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4537037037037037,\n \"acc_stderr\": 0.033953227263757976,\n \"\
acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.033953227263757976\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6617647058823529,\n \"acc_stderr\": 0.03320574612945431,\n \"\
acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.03320574612945431\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7341772151898734,\n \"acc_stderr\": 0.028756799629658342,\n \
\ \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.028756799629658342\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969638,\n\
\ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969638\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.04541609446503948,\n\
\ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.04541609446503948\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.025140935950335428,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.025140935950335428\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6756066411238825,\n\
\ \"acc_stderr\": 0.016740929047162696,\n \"acc_norm\": 0.6756066411238825,\n\
\ \"acc_norm_stderr\": 0.016740929047162696\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.661849710982659,\n \"acc_stderr\": 0.02546977014940017,\n\
\ \"acc_norm\": 0.661849710982659,\n \"acc_norm_stderr\": 0.02546977014940017\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2212290502793296,\n\
\ \"acc_stderr\": 0.01388216459888727,\n \"acc_norm\": 0.2212290502793296,\n\
\ \"acc_norm_stderr\": 0.01388216459888727\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6013071895424836,\n \"acc_stderr\": 0.02803609227389177,\n\
\ \"acc_norm\": 0.6013071895424836,\n \"acc_norm_stderr\": 0.02803609227389177\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6237942122186495,\n\
\ \"acc_stderr\": 0.02751392568354943,\n \"acc_norm\": 0.6237942122186495,\n\
\ \"acc_norm_stderr\": 0.02751392568354943\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6203703703703703,\n \"acc_stderr\": 0.027002521034516468,\n\
\ \"acc_norm\": 0.6203703703703703,\n \"acc_norm_stderr\": 0.027002521034516468\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4078014184397163,\n \"acc_stderr\": 0.02931601177634356,\n \
\ \"acc_norm\": 0.4078014184397163,\n \"acc_norm_stderr\": 0.02931601177634356\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.40808344198174706,\n\
\ \"acc_stderr\": 0.012552598958563662,\n \"acc_norm\": 0.40808344198174706,\n\
\ \"acc_norm_stderr\": 0.012552598958563662\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.03018753206032939,\n\
\ \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.03018753206032939\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5718954248366013,\n \"acc_stderr\": 0.0200176292142131,\n \
\ \"acc_norm\": 0.5718954248366013,\n \"acc_norm_stderr\": 0.0200176292142131\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6938775510204082,\n \"acc_stderr\": 0.02950489645459595,\n\
\ \"acc_norm\": 0.6938775510204082,\n \"acc_norm_stderr\": 0.02950489645459595\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n\
\ \"acc_stderr\": 0.03096590312357303,\n \"acc_norm\": 0.7412935323383084,\n\
\ \"acc_norm_stderr\": 0.03096590312357303\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"\
acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.035650796707083106,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.035650796707083106\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3011015911872705,\n\
\ \"mc1_stderr\": 0.01605899902610061,\n \"mc2\": 0.44887128521126124,\n\
\ \"mc2_stderr\": 0.015342799330160783\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7521704814522494,\n \"acc_stderr\": 0.012134386019865353\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4040940106141016,\n \
\ \"acc_stderr\": 0.013516752972721717\n }\n}\n```"
repo_url: https://huggingface.co/g-ronimo/phi-2-OpenHermes-2.5-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|arc:challenge|25_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|gsm8k|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hellaswag|10_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|winogrande|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T00-49-09.888984.parquet'
- config_name: results
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- results_2024-03-10T00-49-09.888984.parquet
- split: latest
path:
- results_2024-03-10T00-49-09.888984.parquet
---
# Dataset Card for Evaluation run of g-ronimo/phi-2-OpenHermes-2.5-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [g-ronimo/phi-2-OpenHermes-2.5-v2](https://huggingface.co/g-ronimo/phi-2-OpenHermes-2.5-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_g-ronimo__phi-2-OpenHermes-2.5-v2",
"harness_winogrande_5",
split="train")
```
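The timestamped split names described above follow a fixed layout. As a minimal sketch (assuming the `%Y_%m_%dT%H_%M_%S.%f` pattern visible in the split names, e.g. `2024_03_10T00_49_09.888984`), one way to turn a split name back into a `datetime` is:

```python
from datetime import datetime

# Split names such as "2024_03_10T00_49_09.888984" encode the run timestamp.
SPLIT_TS_FORMAT = "%Y_%m_%dT%H_%M_%S.%f"

def split_to_datetime(split_name: str) -> datetime:
    """Parse a timestamped split name back into a datetime object."""
    return datetime.strptime(split_name, SPLIT_TS_FORMAT)

run_time = split_to_datetime("2024_03_10T00_49_09.888984")
print(run_time.isoformat())  # → 2024-03-10T00:49:09.888984
```

This can help when comparing or sorting multiple evaluation runs of the same model by date.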
## Latest results
These are the [latest results from run 2024-03-10T00:49:09.888984](https://huggingface.co/datasets/open-llm-leaderboard/details_g-ronimo__phi-2-OpenHermes-2.5-v2/blob/main/results_2024-03-10T00-49-09.888984.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.564734458241999,
"acc_stderr": 0.03391431521091429,
"acc_norm": 0.5676857564160381,
"acc_norm_stderr": 0.03461774832252384,
"mc1": 0.3011015911872705,
"mc1_stderr": 0.01605899902610061,
"mc2": 0.44887128521126124,
"mc2_stderr": 0.015342799330160783
},
"harness|arc:challenge|25": {
"acc": 0.5699658703071673,
"acc_stderr": 0.01446763155913799,
"acc_norm": 0.5844709897610921,
"acc_norm_stderr": 0.014401366641216388
},
"harness|hellaswag|10": {
"acc": 0.5592511451902011,
"acc_stderr": 0.004954622308738996,
"acc_norm": 0.7456681935869349,
"acc_norm_stderr": 0.004345949382382379
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5657894736842105,
"acc_stderr": 0.0403356566784832,
"acc_norm": 0.5657894736842105,
"acc_norm_stderr": 0.0403356566784832
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5735849056603773,
"acc_stderr": 0.03043779434298305,
"acc_norm": 0.5735849056603773,
"acc_norm_stderr": 0.03043779434298305
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.03794012674697031,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.03794012674697031
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006717,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006717
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404907,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404907
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6645161290322581,
"acc_stderr": 0.026860206444724352,
"acc_norm": 0.6645161290322581,
"acc_norm_stderr": 0.026860206444724352
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6606060606060606,
"acc_stderr": 0.036974422050315967,
"acc_norm": 0.6606060606060606,
"acc_norm_stderr": 0.036974422050315967
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.03135305009533086,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.03135305009533086
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7564766839378239,
"acc_stderr": 0.030975436386845436,
"acc_norm": 0.7564766839378239,
"acc_norm_stderr": 0.030975436386845436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5615384615384615,
"acc_stderr": 0.025158266016868578,
"acc_norm": 0.5615384615384615,
"acc_norm_stderr": 0.025158266016868578
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871916,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871916
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.03196876989195778,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.03196876989195778
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658753,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658753
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7834862385321101,
"acc_stderr": 0.017658710594443128,
"acc_norm": 0.7834862385321101,
"acc_norm_stderr": 0.017658710594443128
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.033953227263757976,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.033953227263757976
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.03320574612945431,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.03320574612945431
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.028756799629658342,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.028756799629658342
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.04225875451969638,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.04225875451969638
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.043300437496507416,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.043300437496507416
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.04541609446503948,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.04541609446503948
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.025140935950335428,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.025140935950335428
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6756066411238825,
"acc_stderr": 0.016740929047162696,
"acc_norm": 0.6756066411238825,
"acc_norm_stderr": 0.016740929047162696
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.661849710982659,
"acc_stderr": 0.02546977014940017,
"acc_norm": 0.661849710982659,
"acc_norm_stderr": 0.02546977014940017
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2212290502793296,
"acc_stderr": 0.01388216459888727,
"acc_norm": 0.2212290502793296,
"acc_norm_stderr": 0.01388216459888727
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6013071895424836,
"acc_stderr": 0.02803609227389177,
"acc_norm": 0.6013071895424836,
"acc_norm_stderr": 0.02803609227389177
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6237942122186495,
"acc_stderr": 0.02751392568354943,
"acc_norm": 0.6237942122186495,
"acc_norm_stderr": 0.02751392568354943
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.027002521034516468,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.027002521034516468
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4078014184397163,
"acc_stderr": 0.02931601177634356,
"acc_norm": 0.4078014184397163,
"acc_norm_stderr": 0.02931601177634356
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.40808344198174706,
"acc_stderr": 0.012552598958563662,
"acc_norm": 0.40808344198174706,
"acc_norm_stderr": 0.012552598958563662
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.03018753206032939,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.03018753206032939
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5718954248366013,
"acc_stderr": 0.0200176292142131,
"acc_norm": 0.5718954248366013,
"acc_norm_stderr": 0.0200176292142131
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6938775510204082,
"acc_stderr": 0.02950489645459595,
"acc_norm": 0.6938775510204082,
"acc_norm_stderr": 0.02950489645459595
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.03096590312357303,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.03096590312357303
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.035650796707083106,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.035650796707083106
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3011015911872705,
"mc1_stderr": 0.01605899902610061,
"mc2": 0.44887128521126124,
"mc2_stderr": 0.015342799330160783
},
"harness|winogrande|5": {
"acc": 0.7521704814522494,
"acc_stderr": 0.012134386019865353
},
"harness|gsm8k|5": {
"acc": 0.4040940106141016,
"acc_stderr": 0.013516752972721717
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_Technoculture__MT7Bi-sft | ---
pretty_name: Evaluation run of Technoculture/MT7Bi-sft
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Technoculture/MT7Bi-sft](https://huggingface.co/Technoculture/MT7Bi-sft) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Technoculture__MT7Bi-sft\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-01T14:25:40.116952](https://huggingface.co/datasets/open-llm-leaderboard/details_Technoculture__MT7Bi-sft/blob/main/results_2024-02-01T14-25-40.116952.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4105658357592459,\n\
\ \"acc_stderr\": 0.03434113134801399,\n \"acc_norm\": 0.416672739687421,\n\
\ \"acc_norm_stderr\": 0.03527569703844115,\n \"mc1\": 0.2802937576499388,\n\
\ \"mc1_stderr\": 0.015723139524608767,\n \"mc2\": 0.4460516469060258,\n\
\ \"mc2_stderr\": 0.01603355318388596\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3796928327645051,\n \"acc_stderr\": 0.014182119866974874,\n\
\ \"acc_norm\": 0.4180887372013652,\n \"acc_norm_stderr\": 0.014413988396996074\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4629555865365465,\n\
\ \"acc_stderr\": 0.004976067726432563,\n \"acc_norm\": 0.5683130850428202,\n\
\ \"acc_norm_stderr\": 0.004942990623131126\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.40131578947368424,\n \"acc_stderr\": 0.039889037033362836,\n\
\ \"acc_norm\": 0.40131578947368424,\n \"acc_norm_stderr\": 0.039889037033362836\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5018867924528302,\n \"acc_stderr\": 0.03077265364207565,\n\
\ \"acc_norm\": 0.5018867924528302,\n \"acc_norm_stderr\": 0.03077265364207565\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4236111111111111,\n\
\ \"acc_stderr\": 0.04132125019723369,\n \"acc_norm\": 0.4236111111111111,\n\
\ \"acc_norm_stderr\": 0.04132125019723369\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.41040462427745666,\n\
\ \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.41040462427745666,\n\
\ \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929774,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929774\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3276595744680851,\n \"acc_stderr\": 0.030683020843231004,\n\
\ \"acc_norm\": 0.3276595744680851,\n \"acc_norm_stderr\": 0.030683020843231004\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.043036840335373146,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.043036840335373146\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2751322751322751,\n \"acc_stderr\": 0.02300008685906864,\n \"\
acc_norm\": 0.2751322751322751,\n \"acc_norm_stderr\": 0.02300008685906864\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.03893259610604674,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.03893259610604674\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.4935483870967742,\n \"acc_stderr\": 0.02844163823354051,\n \"\
acc_norm\": 0.4935483870967742,\n \"acc_norm_stderr\": 0.02844163823354051\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.26108374384236455,\n \"acc_stderr\": 0.0309037969521145,\n \"\
acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.0309037969521145\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\"\
: 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.47878787878787876,\n \"acc_stderr\": 0.039008289137373,\n\
\ \"acc_norm\": 0.47878787878787876,\n \"acc_norm_stderr\": 0.039008289137373\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4797979797979798,\n \"acc_stderr\": 0.03559443565563921,\n \"\
acc_norm\": 0.4797979797979798,\n \"acc_norm_stderr\": 0.03559443565563921\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.47668393782383417,\n \"acc_stderr\": 0.03604513672442206,\n\
\ \"acc_norm\": 0.47668393782383417,\n \"acc_norm_stderr\": 0.03604513672442206\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3435897435897436,\n \"acc_stderr\": 0.024078696580635484,\n\
\ \"acc_norm\": 0.3435897435897436,\n \"acc_norm_stderr\": 0.024078696580635484\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340496,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340496\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3697478991596639,\n \"acc_stderr\": 0.031357095996135904,\n\
\ \"acc_norm\": 0.3697478991596639,\n \"acc_norm_stderr\": 0.031357095996135904\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.037579499229433426,\n \"\
acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.037579499229433426\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5522935779816514,\n \"acc_stderr\": 0.021319754962425462,\n \"\
acc_norm\": 0.5522935779816514,\n \"acc_norm_stderr\": 0.021319754962425462\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.35648148148148145,\n \"acc_stderr\": 0.032664783315272714,\n \"\
acc_norm\": 0.35648148148148145,\n \"acc_norm_stderr\": 0.032664783315272714\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.37745098039215685,\n \"acc_stderr\": 0.03402272044340703,\n \"\
acc_norm\": 0.37745098039215685,\n \"acc_norm_stderr\": 0.03402272044340703\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5780590717299579,\n \"acc_stderr\": 0.032148146302403695,\n \
\ \"acc_norm\": 0.5780590717299579,\n \"acc_norm_stderr\": 0.032148146302403695\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.47533632286995514,\n\
\ \"acc_stderr\": 0.033516951676526276,\n \"acc_norm\": 0.47533632286995514,\n\
\ \"acc_norm_stderr\": 0.033516951676526276\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5343511450381679,\n \"acc_stderr\": 0.043749285605997376,\n\
\ \"acc_norm\": 0.5343511450381679,\n \"acc_norm_stderr\": 0.043749285605997376\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5867768595041323,\n \"acc_stderr\": 0.04495087843548408,\n \"\
acc_norm\": 0.5867768595041323,\n \"acc_norm_stderr\": 0.04495087843548408\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n\
\ \"acc_stderr\": 0.04820403072760628,\n \"acc_norm\": 0.5370370370370371,\n\
\ \"acc_norm_stderr\": 0.04820403072760628\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4294478527607362,\n \"acc_stderr\": 0.038890666191127216,\n\
\ \"acc_norm\": 0.4294478527607362,\n \"acc_norm_stderr\": 0.038890666191127216\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.038946411200447915,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.038946411200447915\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5048543689320388,\n \"acc_stderr\": 0.04950504382128921,\n\
\ \"acc_norm\": 0.5048543689320388,\n \"acc_norm_stderr\": 0.04950504382128921\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6025641025641025,\n\
\ \"acc_stderr\": 0.03205953453789293,\n \"acc_norm\": 0.6025641025641025,\n\
\ \"acc_norm_stderr\": 0.03205953453789293\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4661558109833972,\n\
\ \"acc_stderr\": 0.017838956009136805,\n \"acc_norm\": 0.4661558109833972,\n\
\ \"acc_norm_stderr\": 0.017838956009136805\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.44508670520231214,\n \"acc_stderr\": 0.02675625512966377,\n\
\ \"acc_norm\": 0.44508670520231214,\n \"acc_norm_stderr\": 0.02675625512966377\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2335195530726257,\n\
\ \"acc_stderr\": 0.014149575348976259,\n \"acc_norm\": 0.2335195530726257,\n\
\ \"acc_norm_stderr\": 0.014149575348976259\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4869281045751634,\n \"acc_stderr\": 0.028620130800700246,\n\
\ \"acc_norm\": 0.4869281045751634,\n \"acc_norm_stderr\": 0.028620130800700246\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4437299035369775,\n\
\ \"acc_stderr\": 0.028217683556652308,\n \"acc_norm\": 0.4437299035369775,\n\
\ \"acc_norm_stderr\": 0.028217683556652308\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.42901234567901236,\n \"acc_stderr\": 0.027538925613470863,\n\
\ \"acc_norm\": 0.42901234567901236,\n \"acc_norm_stderr\": 0.027538925613470863\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.33687943262411346,\n \"acc_stderr\": 0.028195534873966734,\n \
\ \"acc_norm\": 0.33687943262411346,\n \"acc_norm_stderr\": 0.028195534873966734\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3057366362451108,\n\
\ \"acc_stderr\": 0.011766973847072914,\n \"acc_norm\": 0.3057366362451108,\n\
\ \"acc_norm_stderr\": 0.011766973847072914\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4227941176470588,\n \"acc_stderr\": 0.030008562845003486,\n\
\ \"acc_norm\": 0.4227941176470588,\n \"acc_norm_stderr\": 0.030008562845003486\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.39705882352941174,\n \"acc_stderr\": 0.01979448890002411,\n \
\ \"acc_norm\": 0.39705882352941174,\n \"acc_norm_stderr\": 0.01979448890002411\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.43636363636363634,\n\
\ \"acc_stderr\": 0.04750185058907297,\n \"acc_norm\": 0.43636363636363634,\n\
\ \"acc_norm_stderr\": 0.04750185058907297\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.49795918367346936,\n \"acc_stderr\": 0.0320089533497105,\n\
\ \"acc_norm\": 0.49795918367346936,\n \"acc_norm_stderr\": 0.0320089533497105\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5472636815920398,\n\
\ \"acc_stderr\": 0.03519702717576915,\n \"acc_norm\": 0.5472636815920398,\n\
\ \"acc_norm_stderr\": 0.03519702717576915\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562427,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562427\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3855421686746988,\n\
\ \"acc_stderr\": 0.037891344246115496,\n \"acc_norm\": 0.3855421686746988,\n\
\ \"acc_norm_stderr\": 0.037891344246115496\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.03834234744164993,\n\
\ \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.03834234744164993\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2802937576499388,\n\
\ \"mc1_stderr\": 0.015723139524608767,\n \"mc2\": 0.4460516469060258,\n\
\ \"mc2_stderr\": 0.01603355318388596\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6045777426992897,\n \"acc_stderr\": 0.013741678387545352\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Technoculture/MT7Bi-sft
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|arc:challenge|25_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|gsm8k|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hellaswag|10_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T14-25-40.116952.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-01T14-25-40.116952.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- '**/details_harness|winogrande|5_2024-02-01T14-25-40.116952.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-01T14-25-40.116952.parquet'
- config_name: results
data_files:
- split: 2024_02_01T14_25_40.116952
path:
- results_2024-02-01T14-25-40.116952.parquet
- split: latest
path:
- results_2024-02-01T14-25-40.116952.parquet
---
# Dataset Card for Evaluation run of Technoculture/MT7Bi-sft
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Technoculture/MT7Bi-sft](https://huggingface.co/Technoculture/MT7Bi-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Technoculture__MT7Bi-sft",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-01T14:25:40.116952](https://huggingface.co/datasets/open-llm-leaderboard/details_Technoculture__MT7Bi-sft/blob/main/results_2024-02-01T14-25-40.116952.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4105658357592459,
"acc_stderr": 0.03434113134801399,
"acc_norm": 0.416672739687421,
"acc_norm_stderr": 0.03527569703844115,
"mc1": 0.2802937576499388,
"mc1_stderr": 0.015723139524608767,
"mc2": 0.4460516469060258,
"mc2_stderr": 0.01603355318388596
},
"harness|arc:challenge|25": {
"acc": 0.3796928327645051,
"acc_stderr": 0.014182119866974874,
"acc_norm": 0.4180887372013652,
"acc_norm_stderr": 0.014413988396996074
},
"harness|hellaswag|10": {
"acc": 0.4629555865365465,
"acc_stderr": 0.004976067726432563,
"acc_norm": 0.5683130850428202,
"acc_norm_stderr": 0.004942990623131126
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.40131578947368424,
"acc_stderr": 0.039889037033362836,
"acc_norm": 0.40131578947368424,
"acc_norm_stderr": 0.039889037033362836
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5018867924528302,
"acc_stderr": 0.03077265364207565,
"acc_norm": 0.5018867924528302,
"acc_norm_stderr": 0.03077265364207565
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4236111111111111,
"acc_stderr": 0.04132125019723369,
"acc_norm": 0.4236111111111111,
"acc_norm_stderr": 0.04132125019723369
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.41040462427745666,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.41040462427745666,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929774,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929774
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3276595744680851,
"acc_stderr": 0.030683020843231004,
"acc_norm": 0.3276595744680851,
"acc_norm_stderr": 0.030683020843231004
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.043036840335373146,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.043036840335373146
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2751322751322751,
"acc_stderr": 0.02300008685906864,
"acc_norm": 0.2751322751322751,
"acc_norm_stderr": 0.02300008685906864
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604674,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604674
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4935483870967742,
"acc_stderr": 0.02844163823354051,
"acc_norm": 0.4935483870967742,
"acc_norm_stderr": 0.02844163823354051
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.26108374384236455,
"acc_stderr": 0.0309037969521145,
"acc_norm": 0.26108374384236455,
"acc_norm_stderr": 0.0309037969521145
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.47878787878787876,
"acc_stderr": 0.039008289137373,
"acc_norm": 0.47878787878787876,
"acc_norm_stderr": 0.039008289137373
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4797979797979798,
"acc_stderr": 0.03559443565563921,
"acc_norm": 0.4797979797979798,
"acc_norm_stderr": 0.03559443565563921
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.47668393782383417,
"acc_stderr": 0.03604513672442206,
"acc_norm": 0.47668393782383417,
"acc_norm_stderr": 0.03604513672442206
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3435897435897436,
"acc_stderr": 0.024078696580635484,
"acc_norm": 0.3435897435897436,
"acc_norm_stderr": 0.024078696580635484
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340496,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3697478991596639,
"acc_stderr": 0.031357095996135904,
"acc_norm": 0.3697478991596639,
"acc_norm_stderr": 0.031357095996135904
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.037579499229433426,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.037579499229433426
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5522935779816514,
"acc_stderr": 0.021319754962425462,
"acc_norm": 0.5522935779816514,
"acc_norm_stderr": 0.021319754962425462
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.35648148148148145,
"acc_stderr": 0.032664783315272714,
"acc_norm": 0.35648148148148145,
"acc_norm_stderr": 0.032664783315272714
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.37745098039215685,
"acc_stderr": 0.03402272044340703,
"acc_norm": 0.37745098039215685,
"acc_norm_stderr": 0.03402272044340703
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5780590717299579,
"acc_stderr": 0.032148146302403695,
"acc_norm": 0.5780590717299579,
"acc_norm_stderr": 0.032148146302403695
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.47533632286995514,
"acc_stderr": 0.033516951676526276,
"acc_norm": 0.47533632286995514,
"acc_norm_stderr": 0.033516951676526276
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5343511450381679,
"acc_stderr": 0.043749285605997376,
"acc_norm": 0.5343511450381679,
"acc_norm_stderr": 0.043749285605997376
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5867768595041323,
"acc_stderr": 0.04495087843548408,
"acc_norm": 0.5867768595041323,
"acc_norm_stderr": 0.04495087843548408
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760628,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4294478527607362,
"acc_stderr": 0.038890666191127216,
"acc_norm": 0.4294478527607362,
"acc_norm_stderr": 0.038890666191127216
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.038946411200447915,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.038946411200447915
},
"harness|hendrycksTest-management|5": {
"acc": 0.5048543689320388,
"acc_stderr": 0.04950504382128921,
"acc_norm": 0.5048543689320388,
"acc_norm_stderr": 0.04950504382128921
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6025641025641025,
"acc_stderr": 0.03205953453789293,
"acc_norm": 0.6025641025641025,
"acc_norm_stderr": 0.03205953453789293
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4661558109833972,
"acc_stderr": 0.017838956009136805,
"acc_norm": 0.4661558109833972,
"acc_norm_stderr": 0.017838956009136805
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.44508670520231214,
"acc_stderr": 0.02675625512966377,
"acc_norm": 0.44508670520231214,
"acc_norm_stderr": 0.02675625512966377
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2335195530726257,
"acc_stderr": 0.014149575348976259,
"acc_norm": 0.2335195530726257,
"acc_norm_stderr": 0.014149575348976259
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4869281045751634,
"acc_stderr": 0.028620130800700246,
"acc_norm": 0.4869281045751634,
"acc_norm_stderr": 0.028620130800700246
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4437299035369775,
"acc_stderr": 0.028217683556652308,
"acc_norm": 0.4437299035369775,
"acc_norm_stderr": 0.028217683556652308
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.42901234567901236,
"acc_stderr": 0.027538925613470863,
"acc_norm": 0.42901234567901236,
"acc_norm_stderr": 0.027538925613470863
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.33687943262411346,
"acc_stderr": 0.028195534873966734,
"acc_norm": 0.33687943262411346,
"acc_norm_stderr": 0.028195534873966734
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3057366362451108,
"acc_stderr": 0.011766973847072914,
"acc_norm": 0.3057366362451108,
"acc_norm_stderr": 0.011766973847072914
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4227941176470588,
"acc_stderr": 0.030008562845003486,
"acc_norm": 0.4227941176470588,
"acc_norm_stderr": 0.030008562845003486
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.39705882352941174,
"acc_stderr": 0.01979448890002411,
"acc_norm": 0.39705882352941174,
"acc_norm_stderr": 0.01979448890002411
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.43636363636363634,
"acc_stderr": 0.04750185058907297,
"acc_norm": 0.43636363636363634,
"acc_norm_stderr": 0.04750185058907297
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.49795918367346936,
"acc_stderr": 0.0320089533497105,
"acc_norm": 0.49795918367346936,
"acc_norm_stderr": 0.0320089533497105
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5472636815920398,
"acc_stderr": 0.03519702717576915,
"acc_norm": 0.5472636815920398,
"acc_norm_stderr": 0.03519702717576915
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562427,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562427
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.037891344246115496,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.037891344246115496
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.03834234744164993,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.03834234744164993
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2802937576499388,
"mc1_stderr": 0.015723139524608767,
"mc2": 0.4460516469060258,
"mc2_stderr": 0.01603355318388596
},
"harness|winogrande|5": {
"acc": 0.6045777426992897,
"acc_stderr": 0.013741678387545352
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
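As an illustration of how the aggregate numbers in the `"all"` block relate to the per-task entries, the sketch below macro-averages the `"acc"` metric over every task that reports one. This is a simplified, hypothetical reconstruction using a small subset of the JSON above, not the leaderboard's official aggregation code (which averages a fixed set of benchmarks):

```python
# Illustrative subset of the per-task results shown above.
results = {
    "harness|arc:challenge|25": {"acc": 0.3796928327645051},
    "harness|hellaswag|10": {"acc": 0.4629555865365465},
    "harness|winogrande|5": {"acc": 0.6045777426992897},
}

def macro_average_acc(results: dict) -> float:
    """Average the 'acc' metric over all tasks that report it."""
    accs = [metrics["acc"] for metrics in results.values() if "acc" in metrics]
    return sum(accs) / len(accs)

print(round(macro_average_acc(results), 4))  # → 0.4824
```

Run over all 57 `hendrycksTest-*` entries, the same averaging reproduces the MMLU-style macro-average that feeds into the leaderboard score.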
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]