| id (string) | lastModified (timestamp) | tags (list) | author (string, nullable) | description (string, nullable) | citation (string, nullable) | cardData (null) | likes (int64) | downloads (int64) | card (string) |
|---|---|---|---|---|---|---|---|---|---|
DRAGOO/dataset_dyal_darija_T5 | 2023-08-28T19:07:23.000Z | [
"region:us"
] | DRAGOO | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: conversation
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 83191.2
num_examples: 72
- name: test
num_bytes: 20797.8
num_examples: 18
download_size: 72585
dataset_size: 103989.0
---
# Dataset Card for "dataset_dyal_darija_T5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
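The card above reports a 72-example train split and an 18-example test split. As a quick sanity check (pure Python, no download required; the numbers are copied from the card's `dataset_info`), the metadata implies an 80/20 train/test split:

```python
# Split metadata copied from the dataset_dyal_darija_T5 card above.
splits = {
    "train": {"num_bytes": 83191.2, "num_examples": 72},
    "test": {"num_bytes": 20797.8, "num_examples": 18},
}

total_examples = sum(s["num_examples"] for s in splits.values())
train_fraction = splits["train"]["num_examples"] / total_examples
avg_bytes = splits["train"]["num_bytes"] / splits["train"]["num_examples"]

print(total_examples)        # 90
print(train_fraction)        # 0.8 -> an 80/20 train/test split
print(round(avg_bytes, 1))   # ~1155.4 bytes per train example
```

The fractional `num_bytes` values in the card are consistent with this: both splits average roughly the same bytes per example.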
bjoernp/code_search_net_python_filtered_top50k | 2023-08-28T19:14:47.000Z | [
"region:us"
] | bjoernp | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: code
dtype: string
- name: signature
dtype: string
- name: docstring
dtype: string
- name: loss_without_docstring
dtype: float64
- name: loss_with_docstring
dtype: float64
- name: factor
dtype: float64
splits:
- name: train
num_bytes: 46636060.557325035
num_examples: 50023
download_size: 15599036
dataset_size: 46636060.557325035
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "code_search_net_python_filtered_top50k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
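The card lists `loss_without_docstring`, `loss_with_docstring`, and `factor` columns but does not define `factor`. One plausible reading, shown here purely as an assumption, is the ratio of the two losses, which would let a top-k filter (as the `filtered_top50k` name suggests) rank examples where the docstring most reduces the loss. The rows below are made up for illustration:

```python
# Hypothetical rows mirroring the card's schema. The "factor" definition
# (loss ratio) is an assumption -- the card itself does not define it.
rows = [
    {"signature": "def a(x):", "loss_without_docstring": 2.4, "loss_with_docstring": 1.2},
    {"signature": "def b(x):", "loss_without_docstring": 1.1, "loss_with_docstring": 1.0},
    {"signature": "def c(x):", "loss_without_docstring": 3.0, "loss_with_docstring": 1.0},
]

# Rank by how much the docstring helps, then keep the top k.
for r in rows:
    r["factor"] = r["loss_without_docstring"] / r["loss_with_docstring"]

top_k = sorted(rows, key=lambda r: r["factor"], reverse=True)[:2]
print([r["signature"] for r in top_k])  # ['def c(x):', 'def a(x):']
```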
bjoernp/code_search_net_python_processed_400k | 2023-08-28T19:22:06.000Z | [
"region:us"
] | bjoernp | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: code
dtype: string
- name: signature
dtype: string
- name: docstring
dtype: string
- name: loss_without_docstring
dtype: float64
- name: loss_with_docstring
dtype: float64
- name: factor
dtype: float64
splits:
- name: train
num_bytes: 373144422
num_examples: 400244
download_size: 150980039
dataset_size: 373144422
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "code_search_net_python_processed_400k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
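The card's `download_size` (compressed Parquet on the Hub) and `dataset_size` (in-memory Arrow size) can be combined into a rough compression ratio and per-example size. A minimal sketch using the numbers from the card above:

```python
# Figures copied from the code_search_net_python_processed_400k card.
download_size = 150_980_039   # compressed Parquet on the Hub
dataset_size = 373_144_422    # in-memory Arrow size
num_examples = 400_244

ratio = dataset_size / download_size
avg_bytes = dataset_size / num_examples
print(round(ratio, 2))   # 2.47 -> roughly 2.5x compression
print(round(avg_bytes))  # 932 bytes per example on average
```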
bjoernp/code_search_net_filtered_top100 | 2023-08-28T19:24:24.000Z | [
"region:us"
] | bjoernp | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: code
dtype: string
- name: signature
dtype: string
- name: docstring
dtype: string
- name: loss_without_docstring
dtype: float64
- name: loss_with_docstring
dtype: float64
- name: factor
dtype: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 166530
num_examples: 100
download_size: 70792
dataset_size: 166530
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "code_search_net_filtered_top100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
polymath707/indollama-70B | 2023-08-28T19:36:14.000Z | [
"license:apache-2.0",
"region:us"
] | polymath707 | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-2.1 | 2023-08-31T13:45:04.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of jondurbin/airoboros-l2-70b-2.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/airoboros-l2-70b-2.1](https://huggingface.co/jondurbin/airoboros-l2-70b-2.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-2.1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-31T13:20:37.537573](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-2.1/blob/main/results_2023-08-31T13%3A20%3A37.537573.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6911102060941249,\n\
\ \"acc_stderr\": 0.03123566483237726,\n \"acc_norm\": 0.6949709994967216,\n\
\ \"acc_norm_stderr\": 0.031206423026103478,\n \"mc1\": 0.401468788249694,\n\
\ \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5600273117673736,\n\
\ \"mc2_stderr\": 0.014903116753397212\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6697952218430034,\n \"acc_stderr\": 0.013743085603760427,\n\
\ \"acc_norm\": 0.7064846416382252,\n \"acc_norm_stderr\": 0.013307250444941117\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.676956781517626,\n\
\ \"acc_stderr\": 0.004666833452796184,\n \"acc_norm\": 0.8680541724756025,\n\
\ \"acc_norm_stderr\": 0.0033774020414626175\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8289473684210527,\n \"acc_stderr\": 0.03064360707167709,\n\
\ \"acc_norm\": 0.8289473684210527,\n \"acc_norm_stderr\": 0.03064360707167709\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544067,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544067\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8263888888888888,\n\
\ \"acc_stderr\": 0.03167473383795718,\n \"acc_norm\": 0.8263888888888888,\n\
\ \"acc_norm_stderr\": 0.03167473383795718\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.04605661864718381,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.04605661864718381\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6978723404255319,\n \"acc_stderr\": 0.030017554471880557,\n\
\ \"acc_norm\": 0.6978723404255319,\n \"acc_norm_stderr\": 0.030017554471880557\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n\
\ \"acc_stderr\": 0.04630653203366595,\n \"acc_norm\": 0.41228070175438597,\n\
\ \"acc_norm_stderr\": 0.04630653203366595\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42857142857142855,\n \"acc_stderr\": 0.025487187147859372,\n \"\
acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.025487187147859372\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8451612903225807,\n\
\ \"acc_stderr\": 0.020579287326583227,\n \"acc_norm\": 0.8451612903225807,\n\
\ \"acc_norm_stderr\": 0.020579287326583227\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.035145285621750094,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.035145285621750094\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.03011768892950359,\n\
\ \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03011768892950359\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8737373737373737,\n \"acc_stderr\": 0.02366435940288023,\n \"\
acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.02366435940288023\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9481865284974094,\n \"acc_stderr\": 0.01599622932024412,\n\
\ \"acc_norm\": 0.9481865284974094,\n \"acc_norm_stderr\": 0.01599622932024412\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.022421273612923714,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.022421273612923714\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.02918571494985741,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.02918571494985741\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.773109243697479,\n \"acc_stderr\": 0.02720537153827948,\n \
\ \"acc_norm\": 0.773109243697479,\n \"acc_norm_stderr\": 0.02720537153827948\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4370860927152318,\n \"acc_stderr\": 0.04050035722230636,\n \"\
acc_norm\": 0.4370860927152318,\n \"acc_norm_stderr\": 0.04050035722230636\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8880733944954129,\n \"acc_stderr\": 0.013517352714958788,\n \"\
acc_norm\": 0.8880733944954129,\n \"acc_norm_stderr\": 0.013517352714958788\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5879629629629629,\n \"acc_stderr\": 0.03356787758160831,\n \"\
acc_norm\": 0.5879629629629629,\n \"acc_norm_stderr\": 0.03356787758160831\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9068627450980392,\n \"acc_stderr\": 0.020397853969427,\n \"acc_norm\"\
: 0.9068627450980392,\n \"acc_norm_stderr\": 0.020397853969427\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.869198312236287,\n \"acc_stderr\": 0.02194876605947076,\n \"acc_norm\"\
: 0.869198312236287,\n \"acc_norm_stderr\": 0.02194876605947076\n },\n\
\ \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7937219730941704,\n\
\ \"acc_stderr\": 0.02715715047956382,\n \"acc_norm\": 0.7937219730941704,\n\
\ \"acc_norm_stderr\": 0.02715715047956382\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8320610687022901,\n \"acc_stderr\": 0.032785485373431386,\n\
\ \"acc_norm\": 0.8320610687022901,\n \"acc_norm_stderr\": 0.032785485373431386\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8925619834710744,\n \"acc_stderr\": 0.02826881219254063,\n \"\
acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.02826881219254063\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8098159509202454,\n \"acc_stderr\": 0.03083349114628123,\n\
\ \"acc_norm\": 0.8098159509202454,\n \"acc_norm_stderr\": 0.03083349114628123\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.039166677628225836,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.039166677628225836\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8492975734355045,\n\
\ \"acc_stderr\": 0.01279342088312082,\n \"acc_norm\": 0.8492975734355045,\n\
\ \"acc_norm_stderr\": 0.01279342088312082\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7774566473988439,\n \"acc_stderr\": 0.02239421566194282,\n\
\ \"acc_norm\": 0.7774566473988439,\n \"acc_norm_stderr\": 0.02239421566194282\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5050279329608939,\n\
\ \"acc_stderr\": 0.01672165603753842,\n \"acc_norm\": 0.5050279329608939,\n\
\ \"acc_norm_stderr\": 0.01672165603753842\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7712418300653595,\n \"acc_stderr\": 0.024051029739912255,\n\
\ \"acc_norm\": 0.7712418300653595,\n \"acc_norm_stderr\": 0.024051029739912255\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.77491961414791,\n\
\ \"acc_stderr\": 0.023720088516179027,\n \"acc_norm\": 0.77491961414791,\n\
\ \"acc_norm_stderr\": 0.023720088516179027\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8117283950617284,\n \"acc_stderr\": 0.02175186606081588,\n\
\ \"acc_norm\": 0.8117283950617284,\n \"acc_norm_stderr\": 0.02175186606081588\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5602836879432624,\n \"acc_stderr\": 0.02960991207559412,\n \
\ \"acc_norm\": 0.5602836879432624,\n \"acc_norm_stderr\": 0.02960991207559412\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5508474576271186,\n\
\ \"acc_stderr\": 0.012704030518851472,\n \"acc_norm\": 0.5508474576271186,\n\
\ \"acc_norm_stderr\": 0.012704030518851472\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.026679252270103142,\n\
\ \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.026679252270103142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7565359477124183,\n \"acc_stderr\": 0.017362473762146613,\n \
\ \"acc_norm\": 0.7565359477124183,\n \"acc_norm_stderr\": 0.017362473762146613\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n\
\ \"acc_stderr\": 0.04265792110940588,\n \"acc_norm\": 0.7272727272727273,\n\
\ \"acc_norm_stderr\": 0.04265792110940588\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7877551020408163,\n \"acc_stderr\": 0.026176967197866764,\n\
\ \"acc_norm\": 0.7877551020408163,\n \"acc_norm_stderr\": 0.026176967197866764\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9104477611940298,\n\
\ \"acc_stderr\": 0.02019067053502791,\n \"acc_norm\": 0.9104477611940298,\n\
\ \"acc_norm_stderr\": 0.02019067053502791\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.02876234912646612,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.02876234912646612\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.02753912288906145,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.02753912288906145\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.401468788249694,\n\
\ \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5600273117673736,\n\
\ \"mc2_stderr\": 0.014903116753397212\n }\n}\n```"
repo_url: https://huggingface.co/jondurbin/airoboros-l2-70b-2.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|arc:challenge|25_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|arc:challenge|25_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hellaswag|10_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hellaswag|10_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T19:47:49.813088.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T13:20:37.537573.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T13:20:37.537573.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T19:47:49.813088.parquet'
- split: 2023_08_31T13_20_37.537573
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T13:20:37.537573.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T13:20:37.537573.parquet'
- config_name: results
data_files:
- split: 2023_08_28T19_47_49.813088
path:
- results_2023-08-28T19:47:49.813088.parquet
- split: 2023_08_31T13_20_37.537573
path:
- results_2023-08-31T13:20:37.537573.parquet
- split: latest
path:
- results_2023-08-31T13:20:37.537573.parquet
---
# Dataset Card for Evaluation run of jondurbin/airoboros-l2-70b-2.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-l2-70b-2.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-l2-70b-2.1](https://huggingface.co/jondurbin/airoboros-l2-70b-2.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-2.1",
"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-08-31T13:20:37.537573](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-2.1/blob/main/results_2023-08-31T13%3A20%3A37.537573.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6911102060941249,
"acc_stderr": 0.03123566483237726,
"acc_norm": 0.6949709994967216,
"acc_norm_stderr": 0.031206423026103478,
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693654,
"mc2": 0.5600273117673736,
"mc2_stderr": 0.014903116753397212
},
"harness|arc:challenge|25": {
"acc": 0.6697952218430034,
"acc_stderr": 0.013743085603760427,
"acc_norm": 0.7064846416382252,
"acc_norm_stderr": 0.013307250444941117
},
"harness|hellaswag|10": {
"acc": 0.676956781517626,
"acc_stderr": 0.004666833452796184,
"acc_norm": 0.8680541724756025,
"acc_norm_stderr": 0.0033774020414626175
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8289473684210527,
"acc_stderr": 0.03064360707167709,
"acc_norm": 0.8289473684210527,
"acc_norm_stderr": 0.03064360707167709
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544067,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8263888888888888,
"acc_stderr": 0.03167473383795718,
"acc_norm": 0.8263888888888888,
"acc_norm_stderr": 0.03167473383795718
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6978723404255319,
"acc_stderr": 0.030017554471880557,
"acc_norm": 0.6978723404255319,
"acc_norm_stderr": 0.030017554471880557
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.04630653203366595,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.04630653203366595
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.025487187147859372,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.025487187147859372
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8451612903225807,
"acc_stderr": 0.020579287326583227,
"acc_norm": 0.8451612903225807,
"acc_norm_stderr": 0.020579287326583227
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.035145285621750094,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.035145285621750094
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03011768892950359,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03011768892950359
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8737373737373737,
"acc_stderr": 0.02366435940288023,
"acc_norm": 0.8737373737373737,
"acc_norm_stderr": 0.02366435940288023
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9481865284974094,
"acc_stderr": 0.01599622932024412,
"acc_norm": 0.9481865284974094,
"acc_norm_stderr": 0.01599622932024412
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.022421273612923714,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.022421273612923714
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.02918571494985741,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.02918571494985741
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.773109243697479,
"acc_stderr": 0.02720537153827948,
"acc_norm": 0.773109243697479,
"acc_norm_stderr": 0.02720537153827948
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4370860927152318,
"acc_stderr": 0.04050035722230636,
"acc_norm": 0.4370860927152318,
"acc_norm_stderr": 0.04050035722230636
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8880733944954129,
"acc_stderr": 0.013517352714958788,
"acc_norm": 0.8880733944954129,
"acc_norm_stderr": 0.013517352714958788
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5879629629629629,
"acc_stderr": 0.03356787758160831,
"acc_norm": 0.5879629629629629,
"acc_norm_stderr": 0.03356787758160831
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9068627450980392,
"acc_stderr": 0.020397853969427,
"acc_norm": 0.9068627450980392,
"acc_norm_stderr": 0.020397853969427
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.869198312236287,
"acc_stderr": 0.02194876605947076,
"acc_norm": 0.869198312236287,
"acc_norm_stderr": 0.02194876605947076
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7937219730941704,
"acc_stderr": 0.02715715047956382,
"acc_norm": 0.7937219730941704,
"acc_norm_stderr": 0.02715715047956382
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8320610687022901,
"acc_stderr": 0.032785485373431386,
"acc_norm": 0.8320610687022901,
"acc_norm_stderr": 0.032785485373431386
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8925619834710744,
"acc_stderr": 0.02826881219254063,
"acc_norm": 0.8925619834710744,
"acc_norm_stderr": 0.02826881219254063
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8098159509202454,
"acc_stderr": 0.03083349114628123,
"acc_norm": 0.8098159509202454,
"acc_norm_stderr": 0.03083349114628123
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8492975734355045,
"acc_stderr": 0.01279342088312082,
"acc_norm": 0.8492975734355045,
"acc_norm_stderr": 0.01279342088312082
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7774566473988439,
"acc_stderr": 0.02239421566194282,
"acc_norm": 0.7774566473988439,
"acc_norm_stderr": 0.02239421566194282
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5050279329608939,
"acc_stderr": 0.01672165603753842,
"acc_norm": 0.5050279329608939,
"acc_norm_stderr": 0.01672165603753842
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7712418300653595,
"acc_stderr": 0.024051029739912255,
"acc_norm": 0.7712418300653595,
"acc_norm_stderr": 0.024051029739912255
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.77491961414791,
"acc_stderr": 0.023720088516179027,
"acc_norm": 0.77491961414791,
"acc_norm_stderr": 0.023720088516179027
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8117283950617284,
"acc_stderr": 0.02175186606081588,
"acc_norm": 0.8117283950617284,
"acc_norm_stderr": 0.02175186606081588
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5602836879432624,
"acc_stderr": 0.02960991207559412,
"acc_norm": 0.5602836879432624,
"acc_norm_stderr": 0.02960991207559412
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5508474576271186,
"acc_stderr": 0.012704030518851472,
"acc_norm": 0.5508474576271186,
"acc_norm_stderr": 0.012704030518851472
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7389705882352942,
"acc_stderr": 0.026679252270103142,
"acc_norm": 0.7389705882352942,
"acc_norm_stderr": 0.026679252270103142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7565359477124183,
"acc_stderr": 0.017362473762146613,
"acc_norm": 0.7565359477124183,
"acc_norm_stderr": 0.017362473762146613
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940588,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940588
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7877551020408163,
"acc_stderr": 0.026176967197866764,
"acc_norm": 0.7877551020408163,
"acc_norm_stderr": 0.026176967197866764
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9104477611940298,
"acc_stderr": 0.02019067053502791,
"acc_norm": 0.9104477611940298,
"acc_norm_stderr": 0.02019067053502791
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.02876234912646612,
"acc_norm": 0.91,
"acc_norm_stderr": 0.02876234912646612
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.02753912288906145,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.02753912288906145
},
"harness|truthfulqa:mc|0": {
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693654,
"mc2": 0.5600273117673736,
"mc2_stderr": 0.014903116753397212
}
}
```
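Every per-task entry above shares the same schema (`acc`, `acc_stderr`, …), so ranking tasks is a small dictionary pass. A minimal sketch over an excerpt of the results (task names and values copied from the JSON above):

```python
# Rank harness tasks by accuracy, using an excerpt of the results above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.35},
    "harness|hendrycksTest-college_physics|5": {"acc": 0.38235294117647056},
    "harness|hendrycksTest-sociology|5": {"acc": 0.9104477611940298},
    "harness|hendrycksTest-high_school_government_and_politics|5": {"acc": 0.9481865284974094},
}

# Sort tasks from highest to lowest accuracy.
ranked = sorted(results.items(), key=lambda kv: kv[1]["acc"], reverse=True)
for task, metrics in ranked:
    print(f"{metrics['acc']:.3f}  {task}")
```

The same pass works on the full dict loaded from the results JSON file linked above.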
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
daveK91/kdom_classification | 2023-08-28T20:20:18.000Z | [
"region:us"
] | daveK91 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: pixel_values
sequence:
sequence:
sequence:
sequence: float32
- name: labels
dtype:
class_label:
names:
'0': Anderes
'1': Dreikoenigenschrein
'2': Gerokreuz
splits:
- name: train
num_bytes: 90724200.0
num_examples: 150
- name: test
num_bytes: 19354496.0
num_examples: 32
- name: validation
num_bytes: 16330356.0
num_examples: 27
download_size: 28668560
dataset_size: 126409052.0
---
# Dataset Card for "kdom_classification"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
stonet2000/stateexplore | 2023-08-28T23:23:33.000Z | [
"region:us"
] | stonet2000 | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_zarakiquemparte__zaraxls-l2-7b | 2023-09-18T09:59:13.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of zarakiquemparte/zaraxls-l2-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [zarakiquemparte/zaraxls-l2-7b](https://huggingface.co/zarakiquemparte/zaraxls-l2-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_zarakiquemparte__zaraxls-l2-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-18T09:59:01.594012](https://huggingface.co/datasets/open-llm-leaderboard/details_zarakiquemparte__zaraxls-l2-7b/blob/main/results_2023-09-18T09-59-01.594012.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2003984899328859,\n\
\ \"em_stderr\": 0.0040994319015717805,\n \"f1\": 0.3159343540268478,\n\
\ \"f1_stderr\": 0.004169444956344296,\n \"acc\": 0.36696200812243857,\n\
\ \"acc_stderr\": 0.006882749087214294\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.2003984899328859,\n \"em_stderr\": 0.0040994319015717805,\n\
\ \"f1\": 0.3159343540268478,\n \"f1_stderr\": 0.004169444956344296\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.002274450341167551,\n \
\ \"acc_stderr\": 0.0013121578148673923\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7316495659037096,\n \"acc_stderr\": 0.012453340359561195\n\
\ }\n}\n```"
repo_url: https://huggingface.co/zarakiquemparte/zaraxls-l2-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|arc:challenge|25_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_18T09_59_01.594012
path:
- '**/details_harness|drop|3_2023-09-18T09-59-01.594012.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-18T09-59-01.594012.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_18T09_59_01.594012
path:
- '**/details_harness|gsm8k|5_2023-09-18T09-59-01.594012.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-18T09-59-01.594012.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hellaswag|10_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T20:28:21.792080.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T20:28:21.792080.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T20:28:21.792080.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_18T09_59_01.594012
path:
- '**/details_harness|winogrande|5_2023-09-18T09-59-01.594012.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-18T09-59-01.594012.parquet'
- config_name: results
data_files:
- split: 2023_08_28T20_28_21.792080
path:
- results_2023-08-28T20:28:21.792080.parquet
- split: 2023_09_18T09_59_01.594012
path:
- results_2023-09-18T09-59-01.594012.parquet
- split: latest
path:
- results_2023-09-18T09-59-01.594012.parquet
---
# Dataset Card for Evaluation run of zarakiquemparte/zaraxls-l2-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/zarakiquemparte/zaraxls-l2-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [zarakiquemparte/zaraxls-l2-7b](https://huggingface.co/zarakiquemparte/zaraxls-l2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_zarakiquemparte__zaraxls-l2-7b",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-09-18T09:59:01.594012](https://huggingface.co/datasets/open-llm-leaderboard/details_zarakiquemparte__zaraxls-l2-7b/blob/main/results_2023-09-18T09-59-01.594012.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.2003984899328859,
"em_stderr": 0.0040994319015717805,
"f1": 0.3159343540268478,
"f1_stderr": 0.004169444956344296,
"acc": 0.36696200812243857,
"acc_stderr": 0.006882749087214294
},
"harness|drop|3": {
"em": 0.2003984899328859,
"em_stderr": 0.0040994319015717805,
"f1": 0.3159343540268478,
"f1_stderr": 0.004169444956344296
},
"harness|gsm8k|5": {
"acc": 0.002274450341167551,
"acc_stderr": 0.0013121578148673923
},
"harness|winogrande|5": {
"acc": 0.7316495659037096,
"acc_stderr": 0.012453340359561195
}
}
```
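If you only need the headline numbers rather than per-example details, the run JSON above can be flattened into `(task, metric, value)` rows. A minimal sketch (the dict literal is copied from the results block above, keeping only the point estimates; this is an illustration, not part of the generated card):

```python
# Per-task metrics from the latest run, copied from the results JSON above.
latest = {
    "harness|drop|3": {"em": 0.2003984899328859, "f1": 0.3159343540268478},
    "harness|gsm8k|5": {"acc": 0.002274450341167551},
    "harness|winogrande|5": {"acc": 0.7316495659037096},
}

# Flatten the nested dict into sorted (task, metric, value) rows.
rows = [
    (task, metric, value)
    for task, metrics in sorted(latest.items())
    for metric, value in sorted(metrics.items())
]
for task, metric, value in rows:
    print(f"{task:25s} {metric:3s} {value:.4f}")
```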
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
dwitidibyajyoti/layoutmlv3_v1 | 2023-08-28T20:29:30.000Z | [
"region:us"
] | dwitidibyajyoti | null | null | null | 0 | 0 | Entry not found |
darthlordvictor/generative-ai-dataset-002 | 2023-09-04T20:49:41.000Z | [
"region:us"
] | darthlordvictor | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: product_name
dtype: string
- name: product_description
dtype: string
- name: marketing_email
dtype: string
splits:
- name: train
num_bytes: 123258
num_examples: 99
download_size: 0
dataset_size: 123258
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "generative-ai-dataset-002"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
awettig/github-sample-65536tokens-llama | 2023-08-28T20:58:51.000Z | [
"region:us"
] | awettig | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 329257888
num_examples: 1256
download_size: 78876374
dataset_size: 329257888
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "github-sample-65536tokens-llama"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AntonioForte/t1 | 2023-08-28T21:04:02.000Z | [
"region:us"
] | AntonioForte | null | null | null | 0 | 0 | Entry not found |
LordTenson/test | 2023-08-28T21:55:38.000Z | [
"region:us"
] | LordTenson | null | null | null | 0 | 0 | Entry not found |
awettig/arxiv-sample-65536tokens-llama | 2023-08-28T21:18:46.000Z | [
"region:us"
] | awettig | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 128976816
num_examples: 492
download_size: 43974626
dataset_size: 128976816
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "arxiv-sample-65536tokens-llama"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
spsither/prepare_dataset_train_batch1 | 2023-08-29T05:29:58.000Z | [
"region:us"
] | spsither | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 95822010464
num_examples: 99760
download_size: 6089546764
dataset_size: 95822010464
---
# Dataset Card for "prepare_dataset_train_batch1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HydraLM/tiny-codes-standardized | 2023-08-30T20:15:16.000Z | [
"region:us"
] | HydraLM | null | null | null | 1 | 0 | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
splits:
- name: train
num_bytes: 3678763115
num_examples: 3264618
download_size: 1264753822
dataset_size: 3678763115
---
# Dataset Card for "tiny-codes-standardized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HydraLM/Evol-Instruct-Code-80k-v1-standardized | 2023-08-30T20:28:05.000Z | [
"region:us"
] | HydraLM | null | null | null | 2 | 0 | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
splits:
- name: train
num_bytes: 120580750
num_examples: 156528
download_size: 52351077
dataset_size: 120580750
---
# Dataset Card for "Evol-Instruct-Code-80k-v1-standardized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HydraLM/instruct-python-500k-standardized | 2023-08-30T20:23:27.000Z | [
"region:us"
] | HydraLM | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
splits:
- name: train
num_bytes: 1010030074
num_examples: 1002698
download_size: 529792228
dataset_size: 1010030074
---
# Dataset Card for "instruct-python-500k-standardized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
na2s/nass2ss | 2023-08-28T22:01:34.000Z | [
"license:other",
"region:us"
] | na2s | null | null | null | 0 | 0 | ---
license: other
---
|
alx-ai/arxiv_papers | 2023-08-28T22:18:57.000Z | [
"region:us"
] | alx-ai | null | null | null | 0 | 0 | Entry not found |
spsither/prepare_dataset_train_batch0 | 2023-08-29T04:37:38.000Z | [
"region:us"
] | spsither | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 95821956240
num_examples: 99760
download_size: 156274877
dataset_size: 95821956240
---
# Dataset Card for "prepare_dataset_train_batch0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_automl_default-of-credit-card-clients_gosdt_l512_d3_sd2 | 2023-08-28T22:31:39.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 10863200000
num_examples: 100000
- name: validation
num_bytes: 1086320000
num_examples: 10000
download_size: 2039406876
dataset_size: 11949520000
---
# Dataset Card for "autotree_automl_default-of-credit-card-clients_gosdt_l512_d3_sd2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_automl_default-of-credit-card-clients_gosdt_l512_d3_sd3 | 2023-08-28T22:41:27.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 10863200000
num_examples: 100000
- name: validation
num_bytes: 1086320000
num_examples: 10000
download_size: 2051534214
dataset_size: 11949520000
---
# Dataset Card for "autotree_automl_default-of-credit-card-clients_gosdt_l512_d3_sd3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_TFLai__Limarp-Platypus2-13B-QLoRA-0.80-epoch | 2023-08-28T22:41:42.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TFLai/Limarp-Platypus2-13B-QLoRA-0.80-epoch
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TFLai/Limarp-Platypus2-13B-QLoRA-0.80-epoch](https://huggingface.co/TFLai/Limarp-Platypus2-13B-QLoRA-0.80-epoch)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TFLai__Limarp-Platypus2-13B-QLoRA-0.80-epoch\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-28T22:39:43.026880](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__Limarp-Platypus2-13B-QLoRA-0.80-epoch/blob/main/results_2023-08-28T22%3A39%3A43.026880.json):\n\
\n```python\n{\n \"all\": {\n \"acc\": 0.5661102173925912,\n \"\
acc_stderr\": 0.034269085107182864,\n \"acc_norm\": 0.5703008281999081,\n\
\ \"acc_norm_stderr\": 0.03424742968042107,\n \"mc1\": 0.3047735618115055,\n\
\ \"mc1_stderr\": 0.01611412415688245,\n \"mc2\": 0.44144767502452936,\n\
\ \"mc2_stderr\": 0.014651638696594051\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5631399317406144,\n \"acc_stderr\": 0.014494421584256517,\n\
\ \"acc_norm\": 0.6049488054607508,\n \"acc_norm_stderr\": 0.01428589829293817\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6221868153754232,\n\
\ \"acc_stderr\": 0.004838496966823907,\n \"acc_norm\": 0.8276239792869946,\n\
\ \"acc_norm_stderr\": 0.003769350079195889\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.040179012759817494,\n\
\ \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.040179012759817494\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5811320754716981,\n \"acc_stderr\": 0.030365050829115208,\n\
\ \"acc_norm\": 0.5811320754716981,\n \"acc_norm_stderr\": 0.030365050829115208\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.625,\n\
\ \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.625,\n \
\ \"acc_norm_stderr\": 0.04048439222695598\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4595744680851064,\n \"acc_stderr\": 0.032579014820998356,\n\
\ \"acc_norm\": 0.4595744680851064,\n \"acc_norm_stderr\": 0.032579014820998356\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.0433913832257986,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.0433913832257986\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35714285714285715,\n \"acc_stderr\": 0.024677862841332786,\n \"\
acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.024677862841332786\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6645161290322581,\n \"acc_stderr\": 0.026860206444724342,\n \"\
acc_norm\": 0.6645161290322581,\n \"acc_norm_stderr\": 0.026860206444724342\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.39408866995073893,\n \"acc_stderr\": 0.034381579670365446,\n \"\
acc_norm\": 0.39408866995073893,\n \"acc_norm_stderr\": 0.034381579670365446\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.696969696969697,\n \"acc_stderr\": 0.03588624800091706,\n\
\ \"acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.03588624800091706\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"\
acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.02717121368316455,\n\
\ \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.02717121368316455\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5358974358974359,\n \"acc_stderr\": 0.02528558599001784,\n \
\ \"acc_norm\": 0.5358974358974359,\n \"acc_norm_stderr\": 0.02528558599001784\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.0284934650910286,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.0284934650910286\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5672268907563025,\n \"acc_stderr\": 0.032183581077426124,\n\
\ \"acc_norm\": 0.5672268907563025,\n \"acc_norm_stderr\": 0.032183581077426124\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7669724770642202,\n \"acc_stderr\": 0.018125669180861514,\n \"\
acc_norm\": 0.7669724770642202,\n \"acc_norm_stderr\": 0.018125669180861514\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145635,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145635\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n\
\ \"acc_stderr\": 0.03259625118416828,\n \"acc_norm\": 0.6188340807174888,\n\
\ \"acc_norm_stderr\": 0.03259625118416828\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677698,\n\
\ \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677698\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908706,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908706\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6503067484662577,\n \"acc_stderr\": 0.03746668325470022,\n\
\ \"acc_norm\": 0.6503067484662577,\n \"acc_norm_stderr\": 0.03746668325470022\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.04246624336697625,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.04246624336697625\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8247863247863247,\n\
\ \"acc_stderr\": 0.02490443909891823,\n \"acc_norm\": 0.8247863247863247,\n\
\ \"acc_norm_stderr\": 0.02490443909891823\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7713920817369093,\n\
\ \"acc_stderr\": 0.015016884698539878,\n \"acc_norm\": 0.7713920817369093,\n\
\ \"acc_norm_stderr\": 0.015016884698539878\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.025906632631016124,\n\
\ \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.025906632631016124\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39217877094972065,\n\
\ \"acc_stderr\": 0.016329061073207446,\n \"acc_norm\": 0.39217877094972065,\n\
\ \"acc_norm_stderr\": 0.016329061073207446\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6143790849673203,\n \"acc_stderr\": 0.027870745278290286,\n\
\ \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.027870745278290286\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6430868167202572,\n\
\ \"acc_stderr\": 0.027210420375934016,\n \"acc_norm\": 0.6430868167202572,\n\
\ \"acc_norm_stderr\": 0.027210420375934016\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6234567901234568,\n \"acc_stderr\": 0.02695934451874778,\n\
\ \"acc_norm\": 0.6234567901234568,\n \"acc_norm_stderr\": 0.02695934451874778\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4078014184397163,\n \"acc_stderr\": 0.02931601177634356,\n \
\ \"acc_norm\": 0.4078014184397163,\n \"acc_norm_stderr\": 0.02931601177634356\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43285528031290743,\n\
\ \"acc_stderr\": 0.012654565234622864,\n \"acc_norm\": 0.43285528031290743,\n\
\ \"acc_norm_stderr\": 0.012654565234622864\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6066176470588235,\n \"acc_stderr\": 0.029674288281311155,\n\
\ \"acc_norm\": 0.6066176470588235,\n \"acc_norm_stderr\": 0.029674288281311155\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5931372549019608,\n \"acc_stderr\": 0.019873802005061177,\n \
\ \"acc_norm\": 0.5931372549019608,\n \"acc_norm_stderr\": 0.019873802005061177\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5918367346938775,\n \"acc_stderr\": 0.03146465712827424,\n\
\ \"acc_norm\": 0.5918367346938775,\n \"acc_norm_stderr\": 0.03146465712827424\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n\
\ \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.7313432835820896,\n\
\ \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.040201512610368466,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.040201512610368466\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3047735618115055,\n\
\ \"mc1_stderr\": 0.01611412415688245,\n \"mc2\": 0.44144767502452936,\n\
\ \"mc2_stderr\": 0.014651638696594051\n }\n}\n```"
repo_url: https://huggingface.co/TFLai/Limarp-Platypus2-13B-QLoRA-0.80-epoch
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|arc:challenge|25_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hellaswag|10_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:39:43.026880.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:39:43.026880.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T22:39:43.026880.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T22:39:43.026880.parquet'
- config_name: results
data_files:
- split: 2023_08_28T22_39_43.026880
path:
- results_2023-08-28T22:39:43.026880.parquet
- split: latest
path:
- results_2023-08-28T22:39:43.026880.parquet
---
# Dataset Card for Evaluation run of TFLai/Limarp-Platypus2-13B-QLoRA-0.80-epoch
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TFLai/Limarp-Platypus2-13B-QLoRA-0.80-epoch
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TFLai/Limarp-Platypus2-13B-QLoRA-0.80-epoch](https://huggingface.co/TFLai/Limarp-Platypus2-13B-QLoRA-0.80-epoch) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TFLai__Limarp-Platypus2-13B-QLoRA-0.80-epoch",
"harness_truthfulqa_mc_0",
	split="latest")
```
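The config names listed in the YAML above follow a regular pattern: `harness_`, then the task name with non-alphanumeric separators replaced by underscores, then the few-shot count. A minimal sketch of that mapping (the helper name is my own, not part of the `datasets` API):

```python
def harness_config_name(task: str, num_fewshot: int) -> str:
    """Map a harness task name (e.g. "hendrycksTest-marketing" or
    "truthfulqa:mc") to the config name used by this dataset
    (e.g. "harness_hendrycksTest_marketing_5")."""
    # Non-alphanumeric separators in the task name become underscores.
    cleaned = "".join(c if c.isalnum() else "_" for c in task)
    return f"harness_{cleaned}_{num_fewshot}"

# Usage (requires network access, so left commented out):
# from datasets import load_dataset
# config = harness_config_name("hendrycksTest-marketing", 5)
# data = load_dataset(
#     "open-llm-leaderboard/details_TFLai__Limarp-Platypus2-13B-QLoRA-0.80-epoch",
#     config,
#     split="latest",
# )
```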
## Latest results
These are the [latest results from run 2023-08-28T22:39:43.026880](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__Limarp-Platypus2-13B-QLoRA-0.80-epoch/blob/main/results_2023-08-28T22%3A39%3A43.026880.json):
```python
{
"all": {
"acc": 0.5661102173925912,
"acc_stderr": 0.034269085107182864,
"acc_norm": 0.5703008281999081,
"acc_norm_stderr": 0.03424742968042107,
"mc1": 0.3047735618115055,
"mc1_stderr": 0.01611412415688245,
"mc2": 0.44144767502452936,
"mc2_stderr": 0.014651638696594051
},
"harness|arc:challenge|25": {
"acc": 0.5631399317406144,
"acc_stderr": 0.014494421584256517,
"acc_norm": 0.6049488054607508,
"acc_norm_stderr": 0.01428589829293817
},
"harness|hellaswag|10": {
"acc": 0.6221868153754232,
"acc_stderr": 0.004838496966823907,
"acc_norm": 0.8276239792869946,
"acc_norm_stderr": 0.003769350079195889
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.040179012759817494,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.040179012759817494
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5811320754716981,
"acc_stderr": 0.030365050829115208,
"acc_norm": 0.5811320754716981,
"acc_norm_stderr": 0.030365050829115208
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.625,
"acc_stderr": 0.04048439222695598,
"acc_norm": 0.625,
"acc_norm_stderr": 0.04048439222695598
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171453,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171453
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4595744680851064,
"acc_stderr": 0.032579014820998356,
"acc_norm": 0.4595744680851064,
"acc_norm_stderr": 0.032579014820998356
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.0433913832257986,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.0433913832257986
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.024677862841332786,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.024677862841332786
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6645161290322581,
"acc_stderr": 0.026860206444724342,
"acc_norm": 0.6645161290322581,
"acc_norm_stderr": 0.026860206444724342
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39408866995073893,
"acc_stderr": 0.034381579670365446,
"acc_norm": 0.39408866995073893,
"acc_norm_stderr": 0.034381579670365446
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.03588624800091706,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.03588624800091706
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.02717121368316455,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.02717121368316455
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5358974358974359,
"acc_stderr": 0.02528558599001784,
"acc_norm": 0.5358974358974359,
"acc_norm_stderr": 0.02528558599001784
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.0284934650910286,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.0284934650910286
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5672268907563025,
"acc_stderr": 0.032183581077426124,
"acc_norm": 0.5672268907563025,
"acc_norm_stderr": 0.032183581077426124
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7669724770642202,
"acc_stderr": 0.018125669180861514,
"acc_norm": 0.7669724770642202,
"acc_norm_stderr": 0.018125669180861514
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145635,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145635
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7383966244725738,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.7383966244725738,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416828,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416828
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6870229007633588,
"acc_stderr": 0.04066962905677698,
"acc_norm": 0.6870229007633588,
"acc_norm_stderr": 0.04066962905677698
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908706,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908706
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6503067484662577,
"acc_stderr": 0.03746668325470022,
"acc_norm": 0.6503067484662577,
"acc_norm_stderr": 0.03746668325470022
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.04246624336697625,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.04246624336697625
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8247863247863247,
"acc_stderr": 0.02490443909891823,
"acc_norm": 0.8247863247863247,
"acc_norm_stderr": 0.02490443909891823
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7713920817369093,
"acc_stderr": 0.015016884698539878,
"acc_norm": 0.7713920817369093,
"acc_norm_stderr": 0.015016884698539878
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.025906632631016124,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.025906632631016124
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39217877094972065,
"acc_stderr": 0.016329061073207446,
"acc_norm": 0.39217877094972065,
"acc_norm_stderr": 0.016329061073207446
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.027870745278290286,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.027870745278290286
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6430868167202572,
"acc_stderr": 0.027210420375934016,
"acc_norm": 0.6430868167202572,
"acc_norm_stderr": 0.027210420375934016
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6234567901234568,
"acc_stderr": 0.02695934451874778,
"acc_norm": 0.6234567901234568,
"acc_norm_stderr": 0.02695934451874778
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4078014184397163,
"acc_stderr": 0.02931601177634356,
"acc_norm": 0.4078014184397163,
"acc_norm_stderr": 0.02931601177634356
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43285528031290743,
"acc_stderr": 0.012654565234622864,
"acc_norm": 0.43285528031290743,
"acc_norm_stderr": 0.012654565234622864
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6066176470588235,
"acc_stderr": 0.029674288281311155,
"acc_norm": 0.6066176470588235,
"acc_norm_stderr": 0.029674288281311155
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5931372549019608,
"acc_stderr": 0.019873802005061177,
"acc_norm": 0.5931372549019608,
"acc_norm_stderr": 0.019873802005061177
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5918367346938775,
"acc_stderr": 0.03146465712827424,
"acc_norm": 0.5918367346938775,
"acc_norm_stderr": 0.03146465712827424
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7313432835820896,
"acc_stderr": 0.03134328358208954,
"acc_norm": 0.7313432835820896,
"acc_norm_stderr": 0.03134328358208954
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.040201512610368466,
"acc_norm": 0.8,
"acc_norm_stderr": 0.040201512610368466
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3047735618115055,
"mc1_stderr": 0.01611412415688245,
"mc2": 0.44144767502452936,
"mc2_stderr": 0.014651638696594051
}
}
```
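The `"all"` block above is, to a close approximation, the unweighted mean of the per-task scores. A minimal sketch of that aggregation (the function name is illustrative, and the leaderboard's own aggregation code may differ in detail):

```python
def macro_average(results: dict, metric: str = "acc") -> float:
    """Unweighted mean of `metric` over all task entries that report it,
    skipping the precomputed "all" block itself."""
    values = [
        task_scores[metric]
        for task_name, task_scores in results.items()
        if task_name != "all" and metric in task_scores
    ]
    return sum(values) / len(values)

# Toy example with two tasks:
toy = {
    "all": {"acc": 0.5},
    "harness|hendrycksTest-marketing|5": {"acc": 0.8},
    "harness|hendrycksTest-virology|5": {"acc": 0.2},
}
# macro_average(toy) == 0.5
```

Tasks that report a different metric (e.g. `truthfulqa:mc`, which reports `mc1`/`mc2` rather than `acc`) are simply skipped for the requested metric.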
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
jxie/scanobjectnn | 2023-08-28T22:43:52.000Z | [
"region:us"
] | jxie | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: inputs
sequence:
sequence: float32
- name: label
dtype: int64
splits:
- name: nobg_train
num_bytes: 75689020
num_examples: 2309
- name: nobg_test
num_bytes: 19045180
num_examples: 581
- name: bg_train
num_bytes: 75689020
num_examples: 2309
- name: bg_test
num_bytes: 19045180
num_examples: 581
- name: hardest_train
num_bytes: 374216480
num_examples: 11416
- name: hardest_test
num_bytes: 94471960
num_examples: 2882
download_size: 493795631
dataset_size: 658156840
---
# Dataset Card for "scanobjectnn"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
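The six splits declared above are the cross product of three background variants (`nobg`, `bg`, `hardest`) and a train/test axis. A small sketch that enumerates them (the helper name is my own):

```python
def scanobjectnn_splits() -> list:
    """Enumerate the six split names declared in the card above:
    three background variants crossed with train/test."""
    variants = ["nobg", "bg", "hardest"]
    return [f"{v}_{part}" for v in variants for part in ("train", "test")]

# Usage (requires network access, so left commented out):
# from datasets import load_dataset
# ds = load_dataset("jxie/scanobjectnn", split="nobg_train")
# ds[0]["inputs"]  # point cloud as nested float32 sequences
# ds[0]["label"]   # int64 class label
```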
open-llm-leaderboard/details_TFLai__Nous-Hermes-Platypus2-13B-QLoRA-0.80-epoch | 2023-08-28T22:46:41.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TFLai/Nous-Hermes-Platypus2-13B-QLoRA-0.80-epoch
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TFLai/Nous-Hermes-Platypus2-13B-QLoRA-0.80-epoch](https://huggingface.co/TFLai/Nous-Hermes-Platypus2-13B-QLoRA-0.80-epoch)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TFLai__Nous-Hermes-Platypus2-13B-QLoRA-0.80-epoch\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-28T22:44:43.350947](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__Nous-Hermes-Platypus2-13B-QLoRA-0.80-epoch/blob/main/results_2023-08-28T22%3A44%3A43.350947.json):\n\
\n```python\n{\n \"all\": {\n \"acc\": 0.5679960716264059,\n \"\
acc_stderr\": 0.03434425016318727,\n \"acc_norm\": 0.5719670720271408,\n\
\ \"acc_norm_stderr\": 0.03432300537049293,\n \"mc1\": 0.35128518971848227,\n\
\ \"mc1_stderr\": 0.016711358163544403,\n \"mc2\": 0.5107941862859451,\n\
\ \"mc2_stderr\": 0.015778601018139424\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5648464163822525,\n \"acc_stderr\": 0.014487986197186043,\n\
\ \"acc_norm\": 0.5989761092150171,\n \"acc_norm_stderr\": 0.014322255790719867\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.632742481577375,\n\
\ \"acc_stderr\": 0.004810723108378215,\n \"acc_norm\": 0.8329018123879706,\n\
\ \"acc_norm_stderr\": 0.003723010745878392\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5921052631578947,\n \"acc_stderr\": 0.03999309712777474,\n\
\ \"acc_norm\": 0.5921052631578947,\n \"acc_norm_stderr\": 0.03999309712777474\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5849056603773585,\n \"acc_stderr\": 0.03032594578928611,\n\
\ \"acc_norm\": 0.5849056603773585,\n \"acc_norm_stderr\": 0.03032594578928611\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04016660030451232,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04016660030451232\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5028901734104047,\n\
\ \"acc_stderr\": 0.03812400565974833,\n \"acc_norm\": 0.5028901734104047,\n\
\ \"acc_norm_stderr\": 0.03812400565974833\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.042801058373643966,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.042801058373643966\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46382978723404256,\n \"acc_stderr\": 0.032600385118357715,\n\
\ \"acc_norm\": 0.46382978723404256,\n \"acc_norm_stderr\": 0.032600385118357715\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3412698412698413,\n \"acc_stderr\": 0.02441923496681907,\n \"\
acc_norm\": 0.3412698412698413,\n \"acc_norm_stderr\": 0.02441923496681907\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6225806451612903,\n\
\ \"acc_stderr\": 0.027575960723278243,\n \"acc_norm\": 0.6225806451612903,\n\
\ \"acc_norm_stderr\": 0.027575960723278243\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.39901477832512317,\n \"acc_stderr\": 0.034454876862647144,\n\
\ \"acc_norm\": 0.39901477832512317,\n \"acc_norm_stderr\": 0.034454876862647144\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7373737373737373,\n \"acc_stderr\": 0.03135305009533086,\n \"\
acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.03135305009533086\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8031088082901554,\n \"acc_stderr\": 0.028697873971860688,\n\
\ \"acc_norm\": 0.8031088082901554,\n \"acc_norm_stderr\": 0.028697873971860688\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5487179487179488,\n \"acc_stderr\": 0.025230381238934833,\n\
\ \"acc_norm\": 0.5487179487179488,\n \"acc_norm_stderr\": 0.025230381238934833\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131126,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131126\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5756302521008403,\n \"acc_stderr\": 0.032104790510157764,\n\
\ \"acc_norm\": 0.5756302521008403,\n \"acc_norm_stderr\": 0.032104790510157764\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7761467889908257,\n \"acc_stderr\": 0.017871217767790215,\n \"\
acc_norm\": 0.7761467889908257,\n \"acc_norm_stderr\": 0.017871217767790215\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.03372343271653064,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.03372343271653064\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.02933116229425174,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.02933116229425174\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7341772151898734,\n \"acc_stderr\": 0.02875679962965834,\n \
\ \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.02875679962965834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.04260735157644559,\n\
\ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.04260735157644559\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.0384985609879409,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.0384985609879409\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935574,\n\
\ \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935574\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690877,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690877\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.025140935950335445,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.025140935950335445\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7637292464878672,\n\
\ \"acc_stderr\": 0.01519047371703751,\n \"acc_norm\": 0.7637292464878672,\n\
\ \"acc_norm_stderr\": 0.01519047371703751\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6011560693641619,\n \"acc_stderr\": 0.026362437574546545,\n\
\ \"acc_norm\": 0.6011560693641619,\n \"acc_norm_stderr\": 0.026362437574546545\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39776536312849164,\n\
\ \"acc_stderr\": 0.01636920497126298,\n \"acc_norm\": 0.39776536312849164,\n\
\ \"acc_norm_stderr\": 0.01636920497126298\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.565359477124183,\n \"acc_stderr\": 0.028384256704883037,\n\
\ \"acc_norm\": 0.565359477124183,\n \"acc_norm_stderr\": 0.028384256704883037\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6045016077170418,\n\
\ \"acc_stderr\": 0.02777091853142784,\n \"acc_norm\": 0.6045016077170418,\n\
\ \"acc_norm_stderr\": 0.02777091853142784\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6419753086419753,\n \"acc_stderr\": 0.02667561192603709,\n\
\ \"acc_norm\": 0.6419753086419753,\n \"acc_norm_stderr\": 0.02667561192603709\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.425531914893617,\n \"acc_stderr\": 0.029494827600144366,\n \
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.029494827600144366\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4439374185136897,\n\
\ \"acc_stderr\": 0.012689708167787682,\n \"acc_norm\": 0.4439374185136897,\n\
\ \"acc_norm_stderr\": 0.012689708167787682\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5698529411764706,\n \"acc_stderr\": 0.030074971917302875,\n\
\ \"acc_norm\": 0.5698529411764706,\n \"acc_norm_stderr\": 0.030074971917302875\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5800653594771242,\n \"acc_stderr\": 0.01996681117825649,\n \
\ \"acc_norm\": 0.5800653594771242,\n \"acc_norm_stderr\": 0.01996681117825649\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.04582004841505417,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.04582004841505417\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6122448979591837,\n \"acc_stderr\": 0.031192230726795656,\n\
\ \"acc_norm\": 0.6122448979591837,\n \"acc_norm_stderr\": 0.031192230726795656\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n\
\ \"acc_stderr\": 0.03115715086935557,\n \"acc_norm\": 0.736318407960199,\n\
\ \"acc_norm_stderr\": 0.03115715086935557\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.03061111655743253,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.03061111655743253\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.35128518971848227,\n\
\ \"mc1_stderr\": 0.016711358163544403,\n \"mc2\": 0.5107941862859451,\n\
\ \"mc2_stderr\": 0.015778601018139424\n }\n}\n```"
repo_url: https://huggingface.co/TFLai/Nous-Hermes-Platypus2-13B-QLoRA-0.80-epoch
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|arc:challenge|25_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hellaswag|10_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:44:43.350947.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:44:43.350947.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T22:44:43.350947.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T22:44:43.350947.parquet'
- config_name: results
data_files:
- split: 2023_08_28T22_44_43.350947
path:
- results_2023-08-28T22:44:43.350947.parquet
- split: latest
path:
- results_2023-08-28T22:44:43.350947.parquet
---
# Dataset Card for Evaluation run of TFLai/Nous-Hermes-Platypus2-13B-QLoRA-0.80-epoch
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TFLai/Nous-Hermes-Platypus2-13B-QLoRA-0.80-epoch
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TFLai/Nous-Hermes-Platypus2-13B-QLoRA-0.80-epoch](https://huggingface.co/TFLai/Nous-Hermes-Platypus2-13B-QLoRA-0.80-epoch) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TFLai__Nous-Hermes-Platypus2-13B-QLoRA-0.80-epoch",
"harness_truthfulqa_mc_0",
	split="latest")
```
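As the `configs` section above shows, each timestamped split name is simply the run timestamp with `-` and `:` replaced by `_` (the `T` and `.` are kept). A run timestamp can therefore be mapped to its split name with a one-liner; the helper name below is illustrative, not part of any API:

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp such as '2023-08-28T22:44:43.350947' to the
    split name used in the configs above ('2023_08_28T22_44_43.350947')."""
    return timestamp.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split("2023-08-28T22:44:43.350947"))
# -> 2023_08_28T22_44_43.350947
```

This is useful when you want to pin a specific run rather than rely on the moving `latest` split.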
## Latest results
These are the [latest results from run 2023-08-28T22:44:43.350947](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__Nous-Hermes-Platypus2-13B-QLoRA-0.80-epoch/blob/main/results_2023-08-28T22%3A44%3A43.350947.json):
```python
{
"all": {
"acc": 0.5679960716264059,
"acc_stderr": 0.03434425016318727,
"acc_norm": 0.5719670720271408,
"acc_norm_stderr": 0.03432300537049293,
"mc1": 0.35128518971848227,
"mc1_stderr": 0.016711358163544403,
"mc2": 0.5107941862859451,
"mc2_stderr": 0.015778601018139424
},
"harness|arc:challenge|25": {
"acc": 0.5648464163822525,
"acc_stderr": 0.014487986197186043,
"acc_norm": 0.5989761092150171,
"acc_norm_stderr": 0.014322255790719867
},
"harness|hellaswag|10": {
"acc": 0.632742481577375,
"acc_stderr": 0.004810723108378215,
"acc_norm": 0.8329018123879706,
"acc_norm_stderr": 0.003723010745878392
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5921052631578947,
"acc_stderr": 0.03999309712777474,
"acc_norm": 0.5921052631578947,
"acc_norm_stderr": 0.03999309712777474
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5849056603773585,
"acc_stderr": 0.03032594578928611,
"acc_norm": 0.5849056603773585,
"acc_norm_stderr": 0.03032594578928611
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04016660030451232,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04016660030451232
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5028901734104047,
"acc_stderr": 0.03812400565974833,
"acc_norm": 0.5028901734104047,
"acc_norm_stderr": 0.03812400565974833
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.042801058373643966,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.042801058373643966
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46382978723404256,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.46382978723404256,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.02441923496681907,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.02441923496681907
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6225806451612903,
"acc_stderr": 0.027575960723278243,
"acc_norm": 0.6225806451612903,
"acc_norm_stderr": 0.027575960723278243
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.39901477832512317,
"acc_stderr": 0.034454876862647144,
"acc_norm": 0.39901477832512317,
"acc_norm_stderr": 0.034454876862647144
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.03135305009533086,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.03135305009533086
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8031088082901554,
"acc_stderr": 0.028697873971860688,
"acc_norm": 0.8031088082901554,
"acc_norm_stderr": 0.028697873971860688
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5487179487179488,
"acc_stderr": 0.025230381238934833,
"acc_norm": 0.5487179487179488,
"acc_norm_stderr": 0.025230381238934833
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131126,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131126
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5756302521008403,
"acc_stderr": 0.032104790510157764,
"acc_norm": 0.5756302521008403,
"acc_norm_stderr": 0.032104790510157764
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7761467889908257,
"acc_stderr": 0.017871217767790215,
"acc_norm": 0.7761467889908257,
"acc_norm_stderr": 0.017871217767790215
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.03372343271653064,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.03372343271653064
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.02933116229425174,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.02933116229425174
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.04260735157644559,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.04260735157644559
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.0384985609879409,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.0384985609879409
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935574,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935574
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291519,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291519
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690877,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690877
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.025140935950335445,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.025140935950335445
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7637292464878672,
"acc_stderr": 0.01519047371703751,
"acc_norm": 0.7637292464878672,
"acc_norm_stderr": 0.01519047371703751
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.026362437574546545,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.026362437574546545
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39776536312849164,
"acc_stderr": 0.01636920497126298,
"acc_norm": 0.39776536312849164,
"acc_norm_stderr": 0.01636920497126298
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.565359477124183,
"acc_stderr": 0.028384256704883037,
"acc_norm": 0.565359477124183,
"acc_norm_stderr": 0.028384256704883037
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6045016077170418,
"acc_stderr": 0.02777091853142784,
"acc_norm": 0.6045016077170418,
"acc_norm_stderr": 0.02777091853142784
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6419753086419753,
"acc_stderr": 0.02667561192603709,
"acc_norm": 0.6419753086419753,
"acc_norm_stderr": 0.02667561192603709
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.029494827600144366,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.029494827600144366
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4439374185136897,
"acc_stderr": 0.012689708167787682,
"acc_norm": 0.4439374185136897,
"acc_norm_stderr": 0.012689708167787682
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5698529411764706,
"acc_stderr": 0.030074971917302875,
"acc_norm": 0.5698529411764706,
"acc_norm_stderr": 0.030074971917302875
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5800653594771242,
"acc_stderr": 0.01996681117825649,
"acc_norm": 0.5800653594771242,
"acc_norm_stderr": 0.01996681117825649
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.04582004841505417,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.04582004841505417
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6122448979591837,
"acc_stderr": 0.031192230726795656,
"acc_norm": 0.6122448979591837,
"acc_norm_stderr": 0.031192230726795656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.736318407960199,
"acc_stderr": 0.03115715086935557,
"acc_norm": 0.736318407960199,
"acc_norm_stderr": 0.03115715086935557
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263686,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263686
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.03061111655743253,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.03061111655743253
},
"harness|truthfulqa:mc|0": {
"mc1": 0.35128518971848227,
"mc1_stderr": 0.016711358163544403,
"mc2": 0.5107941862859451,
"mc2_stderr": 0.015778601018139424
}
}
```
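The per-task blocks above all share the same shape, so the `"all"` figures can be reproduced from them. The sketch below averages a metric over tasks; it assumes the aggregate is an unweighted arithmetic mean, and the two entries are copied from the results above:

```python
# Two per-task entries copied from the latest results shown above.
results = {
    "harness|hendrycksTest-virology|5": {"acc_norm": 0.46987951807228917},
    "harness|hendrycksTest-world_religions|5": {"acc_norm": 0.8011695906432749},
}

def mean_metric(results: dict, metric: str) -> float:
    """Unweighted arithmetic mean of `metric` over every task that reports it."""
    values = [task[metric] for task in results.values() if metric in task]
    return sum(values) / len(values)

print(round(mean_metric(results, "acc_norm"), 4))
# -> 0.6355
```

Running the same helper over all 61 task entries (skipping the TruthfulQA block, which reports `mc1`/`mc2` instead of `acc_norm`) should reproduce the aggregate values in the `"all"` block.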
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_TFLai__MythoMix-Platypus2-13B-QLoRA-0.80-epoch | 2023-08-28T22:47:40.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TFLai/MythoMix-Platypus2-13B-QLoRA-0.80-epoch
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TFLai/MythoMix-Platypus2-13B-QLoRA-0.80-epoch](https://huggingface.co/TFLai/MythoMix-Platypus2-13B-QLoRA-0.80-epoch)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TFLai__MythoMix-Platypus2-13B-QLoRA-0.80-epoch\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"latest\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-28T22:45:44.482040](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__MythoMix-Platypus2-13B-QLoRA-0.80-epoch/blob/main/results_2023-08-28T22%3A45%3A44.482040.json):\n\
\n```python\n{\n \"all\": {\n \"acc\": 0.5588882186227121,\n \"\
acc_stderr\": 0.034387754210310234,\n \"acc_norm\": 0.5629667416287707,\n\
\ \"acc_norm_stderr\": 0.03436555451614923,\n \"mc1\": 0.3671970624235006,\n\
\ \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5218086101134253,\n\
\ \"mc2_stderr\": 0.015699126036459794\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5665529010238908,\n \"acc_stderr\": 0.014481376224558902,\n\
\ \"acc_norm\": 0.6032423208191127,\n \"acc_norm_stderr\": 0.014296513020180637\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.633240390360486,\n\
\ \"acc_stderr\": 0.004809352075008934,\n \"acc_norm\": 0.8371838279227246,\n\
\ \"acc_norm_stderr\": 0.003684433323887794\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5394736842105263,\n \"acc_stderr\": 0.04056242252249034,\n\
\ \"acc_norm\": 0.5394736842105263,\n \"acc_norm_stderr\": 0.04056242252249034\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.5547169811320755,\n \"acc_stderr\": 0.030588052974270655,\n \
\ \"acc_norm\": 0.5547169811320755,\n \"acc_norm_stderr\": 0.030588052974270655\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.625,\n\
\ \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.625,\n \
\ \"acc_norm_stderr\": 0.04048439222695598\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n\
\ \"acc_stderr\": 0.037940126746970296,\n \"acc_norm\": 0.5491329479768786,\n\
\ \"acc_norm_stderr\": 0.037940126746970296\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4297872340425532,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.4297872340425532,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2962962962962963,\n \"acc_stderr\": 0.023517294335963286,\n \"\
acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.023517294335963286\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5967741935483871,\n\
\ \"acc_stderr\": 0.027906150826041146,\n \"acc_norm\": 0.5967741935483871,\n\
\ \"acc_norm_stderr\": 0.027906150826041146\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.37438423645320196,\n \"acc_stderr\": 0.03405155380561952,\n\
\ \"acc_norm\": 0.37438423645320196,\n \"acc_norm_stderr\": 0.03405155380561952\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7121212121212122,\n \"acc_stderr\": 0.03225883512300992,\n \"\
acc_norm\": 0.7121212121212122,\n \"acc_norm_stderr\": 0.03225883512300992\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n\
\ \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5384615384615384,\n \"acc_stderr\": 0.025275892070240644,\n\
\ \"acc_norm\": 0.5384615384615384,\n \"acc_norm_stderr\": 0.025275892070240644\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.02889774874113114,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.02889774874113114\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.032145368597886394,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.032145368597886394\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7559633027522936,\n \"acc_stderr\": 0.018415286351416416,\n \"\
acc_norm\": 0.7559633027522936,\n \"acc_norm_stderr\": 0.018415286351416416\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321617,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321617\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145628,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145628\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969637,\n\
\ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969637\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935574,\n\
\ \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935574\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.04582124160161549,\n\
\ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.04582124160161549\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n\
\ \"acc_stderr\": 0.02416161812798774,\n \"acc_norm\": 0.8376068376068376,\n\
\ \"acc_norm_stderr\": 0.02416161812798774\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7675606641123882,\n\
\ \"acc_stderr\": 0.015104550008905718,\n \"acc_norm\": 0.7675606641123882,\n\
\ \"acc_norm_stderr\": 0.015104550008905718\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6011560693641619,\n \"acc_stderr\": 0.026362437574546545,\n\
\ \"acc_norm\": 0.6011560693641619,\n \"acc_norm_stderr\": 0.026362437574546545\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4491620111731844,\n\
\ \"acc_stderr\": 0.016635838341631917,\n \"acc_norm\": 0.4491620111731844,\n\
\ \"acc_norm_stderr\": 0.016635838341631917\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5915032679738562,\n \"acc_stderr\": 0.028146405993096358,\n\
\ \"acc_norm\": 0.5915032679738562,\n \"acc_norm_stderr\": 0.028146405993096358\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6045016077170418,\n\
\ \"acc_stderr\": 0.02777091853142784,\n \"acc_norm\": 0.6045016077170418,\n\
\ \"acc_norm_stderr\": 0.02777091853142784\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.02686949074481526,\n\
\ \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.02686949074481526\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370593,\n \
\ \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370593\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.439374185136897,\n\
\ \"acc_stderr\": 0.012676014778580214,\n \"acc_norm\": 0.439374185136897,\n\
\ \"acc_norm_stderr\": 0.012676014778580214\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5330882352941176,\n \"acc_stderr\": 0.030306257722468317,\n\
\ \"acc_norm\": 0.5330882352941176,\n \"acc_norm_stderr\": 0.030306257722468317\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5964052287581699,\n \"acc_stderr\": 0.019848280168401147,\n \
\ \"acc_norm\": 0.5964052287581699,\n \"acc_norm_stderr\": 0.019848280168401147\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6122448979591837,\n \"acc_stderr\": 0.031192230726795656,\n\
\ \"acc_norm\": 0.6122448979591837,\n \"acc_norm_stderr\": 0.031192230726795656\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7213930348258707,\n\
\ \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.7213930348258707,\n\
\ \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932264,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932264\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3671970624235006,\n\
\ \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5218086101134253,\n\
\ \"mc2_stderr\": 0.015699126036459794\n }\n}\n```"
repo_url: https://huggingface.co/TFLai/MythoMix-Platypus2-13B-QLoRA-0.80-epoch
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|arc:challenge|25_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hellaswag|10_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:45:44.482040.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:45:44.482040.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T22:45:44.482040.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T22:45:44.482040.parquet'
- config_name: results
data_files:
- split: 2023_08_28T22_45_44.482040
path:
- results_2023-08-28T22:45:44.482040.parquet
- split: latest
path:
- results_2023-08-28T22:45:44.482040.parquet
---
# Dataset Card for Evaluation run of TFLai/MythoMix-Platypus2-13B-QLoRA-0.80-epoch
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TFLai/MythoMix-Platypus2-13B-QLoRA-0.80-epoch
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TFLai/MythoMix-Platypus2-13B-QLoRA-0.80-epoch](https://huggingface.co/TFLai/MythoMix-Platypus2-13B-QLoRA-0.80-epoch) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TFLai__MythoMix-Platypus2-13B-QLoRA-0.80-epoch",
"harness_truthfulqa_mc_0",
                    split="latest")
```
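Each configuration exposes one split per evaluation run, named with the run timestamp, plus a `latest` alias. If you need to resolve the newest timestamped split yourself (for instance, to compare specific runs once several exist), the names sort chronologically as plain strings. A minimal sketch, with the split list hard-coded here for illustration (in practice you would read it from the dataset's split names):

```python
# Split names for a configuration: timestamped runs plus a "latest" alias.
# In practice, obtain these via get_dataset_split_names() or dataset.keys().
splits = ["2023_08_28T22_45_44.482040", "latest"]

# Timestamps use the pattern YYYY_MM_DDTHH_MM_SS.ffffff, so lexicographic
# order matches chronological order and max() picks the newest run.
timestamped = [s for s in splits if s != "latest"]
newest = max(timestamped)
print(newest)
```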
## Latest results
These are the [latest results from run 2023-08-28T22:45:44.482040](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__MythoMix-Platypus2-13B-QLoRA-0.80-epoch/blob/main/results_2023-08-28T22%3A45%3A44.482040.json):
```json
{
"all": {
"acc": 0.5588882186227121,
"acc_stderr": 0.034387754210310234,
"acc_norm": 0.5629667416287707,
"acc_norm_stderr": 0.03436555451614923,
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.5218086101134253,
"mc2_stderr": 0.015699126036459794
},
"harness|arc:challenge|25": {
"acc": 0.5665529010238908,
"acc_stderr": 0.014481376224558902,
"acc_norm": 0.6032423208191127,
"acc_norm_stderr": 0.014296513020180637
},
"harness|hellaswag|10": {
"acc": 0.633240390360486,
"acc_stderr": 0.004809352075008934,
"acc_norm": 0.8371838279227246,
"acc_norm_stderr": 0.003684433323887794
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5394736842105263,
"acc_stderr": 0.04056242252249034,
"acc_norm": 0.5394736842105263,
"acc_norm_stderr": 0.04056242252249034
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5547169811320755,
"acc_stderr": 0.030588052974270655,
"acc_norm": 0.5547169811320755,
"acc_norm_stderr": 0.030588052974270655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.625,
"acc_stderr": 0.04048439222695598,
"acc_norm": 0.625,
"acc_norm_stderr": 0.04048439222695598
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.037940126746970296,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.037940126746970296
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4297872340425532,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.4297872340425532,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.023517294335963286,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.023517294335963286
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5967741935483871,
"acc_stderr": 0.027906150826041146,
"acc_norm": 0.5967741935483871,
"acc_norm_stderr": 0.027906150826041146
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.37438423645320196,
"acc_stderr": 0.03405155380561952,
"acc_norm": 0.37438423645320196,
"acc_norm_stderr": 0.03405155380561952
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7121212121212122,
"acc_stderr": 0.03225883512300992,
"acc_norm": 0.7121212121212122,
"acc_norm_stderr": 0.03225883512300992
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5384615384615384,
"acc_stderr": 0.025275892070240644,
"acc_norm": 0.5384615384615384,
"acc_norm_stderr": 0.025275892070240644
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.02889774874113114,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.02889774874113114
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.032145368597886394,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.032145368597886394
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7559633027522936,
"acc_stderr": 0.018415286351416416,
"acc_norm": 0.7559633027522936,
"acc_norm_stderr": 0.018415286351416416
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145628,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145628
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.729957805907173,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.729957805907173,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.04225875451969637,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.04225875451969637
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935574,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935574
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.04582124160161549,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.04582124160161549
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8376068376068376,
"acc_stderr": 0.02416161812798774,
"acc_norm": 0.8376068376068376,
"acc_norm_stderr": 0.02416161812798774
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7675606641123882,
"acc_stderr": 0.015104550008905718,
"acc_norm": 0.7675606641123882,
"acc_norm_stderr": 0.015104550008905718
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.026362437574546545,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.026362437574546545
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4491620111731844,
"acc_stderr": 0.016635838341631917,
"acc_norm": 0.4491620111731844,
"acc_norm_stderr": 0.016635838341631917
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5915032679738562,
"acc_stderr": 0.028146405993096358,
"acc_norm": 0.5915032679738562,
"acc_norm_stderr": 0.028146405993096358
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6045016077170418,
"acc_stderr": 0.02777091853142784,
"acc_norm": 0.6045016077170418,
"acc_norm_stderr": 0.02777091853142784
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.02686949074481526,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.02686949074481526
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.029462189233370593,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.029462189233370593
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.439374185136897,
"acc_stderr": 0.012676014778580214,
"acc_norm": 0.439374185136897,
"acc_norm_stderr": 0.012676014778580214
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5330882352941176,
"acc_stderr": 0.030306257722468317,
"acc_norm": 0.5330882352941176,
"acc_norm_stderr": 0.030306257722468317
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5964052287581699,
"acc_stderr": 0.019848280168401147,
"acc_norm": 0.5964052287581699,
"acc_norm_stderr": 0.019848280168401147
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6122448979591837,
"acc_stderr": 0.031192230726795656,
"acc_norm": 0.6122448979591837,
"acc_norm_stderr": 0.031192230726795656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7213930348258707,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.7213930348258707,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932264,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932264
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.5218086101134253,
"mc2_stderr": 0.015699126036459794
}
}
```
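
A single headline score for a run like this is typically obtained by macro-averaging the per-task accuracies. A minimal sketch, assuming a plain unweighted mean (the task names and values below are copied from a few entries of the results JSON above; the aggregation rule is illustrative, not necessarily the leaderboard's exact formula):

```python
# Macro-average a few per-task "acc" values from the results above.
# The three tasks chosen here are an arbitrary illustrative subset.
scores = {
    "formal_logic": 0.3888888888888889,
    "global_facts": 0.38,
    "high_school_biology": 0.5967741935483871,
}

macro_avg = sum(scores.values()) / len(scores)
print(round(macro_avg, 4))  # → 0.4552
```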
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Danielbrdz__Barcenas-7b | 2023-09-17T23:34:19.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Danielbrdz/Barcenas-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Danielbrdz/Barcenas-7b](https://huggingface.co/Danielbrdz/Barcenas-7b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Danielbrdz__Barcenas-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T23:34:07.541919](https://huggingface.co/datasets/open-llm-leaderboard/details_Danielbrdz__Barcenas-7b/blob/main/results_2023-09-17T23-34-07.541919.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.004718959731543624,\n\
\ \"em_stderr\": 0.0007018360183131257,\n \"f1\": 0.0816715604026848,\n\
\ \"f1_stderr\": 0.0017762083839348887,\n \"acc\": 0.39889766050552516,\n\
\ \"acc_stderr\": 0.009497938418122394\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.004718959731543624,\n \"em_stderr\": 0.0007018360183131257,\n\
\ \"f1\": 0.0816715604026848,\n \"f1_stderr\": 0.0017762083839348887\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06141015921152388,\n \
\ \"acc_stderr\": 0.006613027536586322\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7363851617995264,\n \"acc_stderr\": 0.012382849299658464\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Danielbrdz/Barcenas-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|arc:challenge|25_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T23_34_07.541919
path:
- '**/details_harness|drop|3_2023-09-17T23-34-07.541919.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T23-34-07.541919.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T23_34_07.541919
path:
- '**/details_harness|gsm8k|5_2023-09-17T23-34-07.541919.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T23-34-07.541919.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hellaswag|10_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:47:45.353935.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T22:47:45.353935.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T22:47:45.353935.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T23_34_07.541919
path:
- '**/details_harness|winogrande|5_2023-09-17T23-34-07.541919.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T23-34-07.541919.parquet'
- config_name: results
data_files:
- split: 2023_08_28T22_47_45.353935
path:
- results_2023-08-28T22:47:45.353935.parquet
- split: 2023_09_17T23_34_07.541919
path:
- results_2023-09-17T23-34-07.541919.parquet
- split: latest
path:
- results_2023-09-17T23-34-07.541919.parquet
---
# Dataset Card for Evaluation run of Danielbrdz/Barcenas-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Danielbrdz/Barcenas-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Danielbrdz/Barcenas-7b](https://huggingface.co/Danielbrdz/Barcenas-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Danielbrdz__Barcenas-7b",
"harness_winogrande_5",
	split="latest")
```
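Note that split names are derived from the run timestamp by replacing the `-` and `:` separators with underscores (compare the split `2023_09_17T23_34_07.541919` with the run time `2023-09-17T23:34:07.541919`). A small helper sketching that naming convention, useful for selecting a specific run instead of `latest`:

```python
def timestamp_to_split(ts: str) -> str:
    """Map a run timestamp to the split name used in this dataset,
    e.g. '2023-09-17T23:34:07.541919' -> '2023_09_17T23_34_07.541919'."""
    return ts.replace("-", "_").replace(":", "_")

# Pick the split for a known run time:
split_name = timestamp_to_split("2023-09-17T23:34:07.541919")
```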
## Latest results
These are the [latest results from run 2023-09-17T23:34:07.541919](https://huggingface.co/datasets/open-llm-leaderboard/details_Danielbrdz__Barcenas-7b/blob/main/results_2023-09-17T23-34-07.541919.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each task in the "results" configuration and in the "latest" split of its own configuration):
```python
{
"all": {
"em": 0.004718959731543624,
"em_stderr": 0.0007018360183131257,
"f1": 0.0816715604026848,
"f1_stderr": 0.0017762083839348887,
"acc": 0.39889766050552516,
"acc_stderr": 0.009497938418122394
},
"harness|drop|3": {
"em": 0.004718959731543624,
"em_stderr": 0.0007018360183131257,
"f1": 0.0816715604026848,
"f1_stderr": 0.0017762083839348887
},
"harness|gsm8k|5": {
"acc": 0.06141015921152388,
"acc_stderr": 0.006613027536586322
},
"harness|winogrande|5": {
"acc": 0.7363851617995264,
"acc_stderr": 0.012382849299658464
}
}
```
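The results above are a plain nested dict mapping each task (plus the `"all"` aggregate) to its metrics, so they can be flattened into per-task rows for comparison; a minimal sketch over a trimmed copy of the dict above:

```python
# Trimmed copy of the results structure shown above, for illustration.
results = {
    "all": {"em": 0.004718959731543624, "f1": 0.0816715604026848,
            "acc": 0.39889766050552516},
    "harness|gsm8k|5": {"acc": 0.06141015921152388,
                        "acc_stderr": 0.006613027536586322},
    "harness|winogrande|5": {"acc": 0.7363851617995264,
                             "acc_stderr": 0.012382849299658464},
}

def flatten_results(results):
    """Yield (task, metric, value) rows, skipping the 'all' aggregate."""
    for task, metrics in results.items():
        if task == "all":
            continue
        for metric, value in metrics.items():
            yield task, metric, value

rows = list(flatten_results(results))
# e.g. the task with the highest accuracy:
best = max((r for r in rows if r[1] == "acc"), key=lambda r: r[2])
```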
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_TFLai__OpenOrca-Platypus2-13B-QLoRA-0.80-epoch | 2023-08-28T22:52:28.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TFLai/OpenOrca-Platypus2-13B-QLoRA-0.80-epoch
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TFLai/OpenOrca-Platypus2-13B-QLoRA-0.80-epoch](https://huggingface.co/TFLai/OpenOrca-Platypus2-13B-QLoRA-0.80-epoch)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TFLai__OpenOrca-Platypus2-13B-QLoRA-0.80-epoch\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-28T22:50:32.447793](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__OpenOrca-Platypus2-13B-QLoRA-0.80-epoch/blob/main/results_2023-08-28T22%3A50%3A32.447793.json):\n\
\n```python\n{\n \"all\": {\n \"acc\": 0.5944559513864108,\n \"\
acc_stderr\": 0.033969711872475335,\n \"acc_norm\": 0.5982758135590844,\n\
\ \"acc_norm_stderr\": 0.03394862862431821,\n \"mc1\": 0.36964504283965727,\n\
\ \"mc1_stderr\": 0.016898180706973888,\n \"mc2\": 0.5220378272071609,\n\
\ \"mc2_stderr\": 0.015680700152502516\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5964163822525598,\n \"acc_stderr\": 0.014337158914268447,\n\
\ \"acc_norm\": 0.6237201365187713,\n \"acc_norm_stderr\": 0.014157022555407163\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6318462457677754,\n\
\ \"acc_stderr\": 0.004813177057496268,\n \"acc_norm\": 0.8299143596893049,\n\
\ \"acc_norm_stderr\": 0.003749401775087307\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n\
\ \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6113207547169811,\n \"acc_stderr\": 0.030000485448675986,\n\
\ \"acc_norm\": 0.6113207547169811,\n \"acc_norm_stderr\": 0.030000485448675986\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6458333333333334,\n\
\ \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.6458333333333334,\n\
\ \"acc_norm_stderr\": 0.039994111357535424\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5606936416184971,\n\
\ \"acc_stderr\": 0.037842719328874674,\n \"acc_norm\": 0.5606936416184971,\n\
\ \"acc_norm_stderr\": 0.037842719328874674\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.03267862331014063,\n\
\ \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.03267862331014063\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.04404556157374767,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.04404556157374767\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155254,\n \"\
acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155254\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.043062412591271526,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.043062412591271526\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6645161290322581,\n\
\ \"acc_stderr\": 0.026860206444724356,\n \"acc_norm\": 0.6645161290322581,\n\
\ \"acc_norm_stderr\": 0.026860206444724356\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.03481904844438804,\n\
\ \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.03481904844438804\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.033175059300091805,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.033175059300091805\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338642,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338642\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6230769230769231,\n \"acc_stderr\": 0.024570975364225995,\n\
\ \"acc_norm\": 0.6230769230769231,\n \"acc_norm_stderr\": 0.024570975364225995\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253252,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253252\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6134453781512605,\n \"acc_stderr\": 0.03163145807552378,\n \
\ \"acc_norm\": 0.6134453781512605,\n \"acc_norm_stderr\": 0.03163145807552378\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7944954128440367,\n \"acc_stderr\": 0.017324352325016012,\n \"\
acc_norm\": 0.7944954128440367,\n \"acc_norm_stderr\": 0.017324352325016012\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4537037037037037,\n \"acc_stderr\": 0.03395322726375797,\n \"\
acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.03395322726375797\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240647,\n \"\
acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240647\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159267,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159267\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.040933292298342784,\n\
\ \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.040933292298342784\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302873,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302873\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.03680350371286461,\n\
\ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.03680350371286461\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7854406130268199,\n\
\ \"acc_stderr\": 0.014680033956893346,\n \"acc_norm\": 0.7854406130268199,\n\
\ \"acc_norm_stderr\": 0.014680033956893346\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.02552247463212161,\n\
\ \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.02552247463212161\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4581005586592179,\n\
\ \"acc_stderr\": 0.01666368329502052,\n \"acc_norm\": 0.4581005586592179,\n\
\ \"acc_norm_stderr\": 0.01666368329502052\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027914055510468008,\n\
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027914055510468008\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6559485530546624,\n\
\ \"acc_stderr\": 0.026981478043648036,\n \"acc_norm\": 0.6559485530546624,\n\
\ \"acc_norm_stderr\": 0.026981478043648036\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6851851851851852,\n \"acc_stderr\": 0.02584224870090217,\n\
\ \"acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.02584224870090217\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4485006518904824,\n\
\ \"acc_stderr\": 0.012702317490559811,\n \"acc_norm\": 0.4485006518904824,\n\
\ \"acc_norm_stderr\": 0.012702317490559811\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5808823529411765,\n \"acc_stderr\": 0.029972807170464622,\n\
\ \"acc_norm\": 0.5808823529411765,\n \"acc_norm_stderr\": 0.029972807170464622\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6143790849673203,\n \"acc_stderr\": 0.01969145905235403,\n \
\ \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.01969145905235403\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6816326530612244,\n \"acc_stderr\": 0.029822533793982062,\n\
\ \"acc_norm\": 0.6816326530612244,\n \"acc_norm_stderr\": 0.029822533793982062\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n\
\ \"acc_stderr\": 0.029475250236017193,\n \"acc_norm\": 0.7761194029850746,\n\
\ \"acc_norm_stderr\": 0.029475250236017193\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36964504283965727,\n\
\ \"mc1_stderr\": 0.016898180706973888,\n \"mc2\": 0.5220378272071609,\n\
\ \"mc2_stderr\": 0.015680700152502516\n }\n}\n```"
repo_url: https://huggingface.co/TFLai/OpenOrca-Platypus2-13B-QLoRA-0.80-epoch
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|arc:challenge|25_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hellaswag|10_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:50:32.447793.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:50:32.447793.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T22:50:32.447793.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T22:50:32.447793.parquet'
- config_name: results
data_files:
- split: 2023_08_28T22_50_32.447793
path:
- results_2023-08-28T22:50:32.447793.parquet
- split: latest
path:
- results_2023-08-28T22:50:32.447793.parquet
---
# Dataset Card for Evaluation run of TFLai/OpenOrca-Platypus2-13B-QLoRA-0.80-epoch
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TFLai/OpenOrca-Platypus2-13B-QLoRA-0.80-epoch
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TFLai/OpenOrca-Platypus2-13B-QLoRA-0.80-epoch](https://huggingface.co/TFLai/OpenOrca-Platypus2-13B-QLoRA-0.80-epoch) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TFLai__OpenOrca-Platypus2-13B-QLoRA-0.80-epoch",
"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-08-28T22:50:32.447793](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__OpenOrca-Platypus2-13B-QLoRA-0.80-epoch/blob/main/results_2023-08-28T22%3A50%3A32.447793.json):
```json
{
"all": {
"acc": 0.5944559513864108,
"acc_stderr": 0.033969711872475335,
"acc_norm": 0.5982758135590844,
"acc_norm_stderr": 0.03394862862431821,
"mc1": 0.36964504283965727,
"mc1_stderr": 0.016898180706973888,
"mc2": 0.5220378272071609,
"mc2_stderr": 0.015680700152502516
},
"harness|arc:challenge|25": {
"acc": 0.5964163822525598,
"acc_stderr": 0.014337158914268447,
"acc_norm": 0.6237201365187713,
"acc_norm_stderr": 0.014157022555407163
},
"harness|hellaswag|10": {
"acc": 0.6318462457677754,
"acc_stderr": 0.004813177057496268,
"acc_norm": 0.8299143596893049,
"acc_norm_stderr": 0.003749401775087307
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6113207547169811,
"acc_stderr": 0.030000485448675986,
"acc_norm": 0.6113207547169811,
"acc_norm_stderr": 0.030000485448675986
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6458333333333334,
"acc_stderr": 0.039994111357535424,
"acc_norm": 0.6458333333333334,
"acc_norm_stderr": 0.039994111357535424
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5606936416184971,
"acc_stderr": 0.037842719328874674,
"acc_norm": 0.5606936416184971,
"acc_norm_stderr": 0.037842719328874674
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374767,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374767
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155254,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155254
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.043062412591271526,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.043062412591271526
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6645161290322581,
"acc_stderr": 0.026860206444724356,
"acc_norm": 0.6645161290322581,
"acc_norm_stderr": 0.026860206444724356
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.03481904844438804,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.03481904844438804
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.033175059300091805,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.033175059300091805
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338642,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338642
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015178,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6230769230769231,
"acc_stderr": 0.024570975364225995,
"acc_norm": 0.6230769230769231,
"acc_norm_stderr": 0.024570975364225995
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253252,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253252
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6134453781512605,
"acc_stderr": 0.03163145807552378,
"acc_norm": 0.6134453781512605,
"acc_norm_stderr": 0.03163145807552378
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7944954128440367,
"acc_stderr": 0.017324352325016012,
"acc_norm": 0.7944954128440367,
"acc_norm_stderr": 0.017324352325016012
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.03395322726375797,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.03395322726375797
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240647,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240647
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159267,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159267
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.040933292298342784,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.040933292298342784
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302873,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302873
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.03680350371286461,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.03680350371286461
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7854406130268199,
"acc_stderr": 0.014680033956893346,
"acc_norm": 0.7854406130268199,
"acc_norm_stderr": 0.014680033956893346
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.02552247463212161,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.02552247463212161
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4581005586592179,
"acc_stderr": 0.01666368329502052,
"acc_norm": 0.4581005586592179,
"acc_norm_stderr": 0.01666368329502052
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.027914055510468008,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.027914055510468008
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6559485530546624,
"acc_stderr": 0.026981478043648036,
"acc_norm": 0.6559485530546624,
"acc_norm_stderr": 0.026981478043648036
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.02584224870090217,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.02584224870090217
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4485006518904824,
"acc_stderr": 0.012702317490559811,
"acc_norm": 0.4485006518904824,
"acc_norm_stderr": 0.012702317490559811
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5808823529411765,
"acc_stderr": 0.029972807170464622,
"acc_norm": 0.5808823529411765,
"acc_norm_stderr": 0.029972807170464622
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.01969145905235403,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.01969145905235403
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6816326530612244,
"acc_stderr": 0.029822533793982062,
"acc_norm": 0.6816326530612244,
"acc_norm_stderr": 0.029822533793982062
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7761194029850746,
"acc_stderr": 0.029475250236017193,
"acc_norm": 0.7761194029850746,
"acc_norm_stderr": 0.029475250236017193
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36964504283965727,
"mc1_stderr": 0.016898180706973888,
"mc2": 0.5220378272071609,
"mc2_stderr": 0.015680700152502516
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_TFLai__OrcaMini-Platypus2-13B-QLoRA-0.80-epoch | 2023-08-28T22:54:24.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TFLai/OrcaMini-Platypus2-13B-QLoRA-0.80-epoch
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TFLai/OrcaMini-Platypus2-13B-QLoRA-0.80-epoch](https://huggingface.co/TFLai/OrcaMini-Platypus2-13B-QLoRA-0.80-epoch)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TFLai__OrcaMini-Platypus2-13B-QLoRA-0.80-epoch\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-28T22:52:27.560095](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__OrcaMini-Platypus2-13B-QLoRA-0.80-epoch/blob/main/results_2023-08-28T22%3A52%3A27.560095.json):\n\
\n```python\n{\n \"all\": {\n \"acc\": 0.5655100782367286,\n \"\
acc_stderr\": 0.03434599353980488,\n \"acc_norm\": 0.5693417551940863,\n\
\ \"acc_norm_stderr\": 0.034325756733802795,\n \"mc1\": 0.3806609547123623,\n\
\ \"mc1_stderr\": 0.016997627871907922,\n \"mc2\": 0.5331823936755519,\n\
\ \"mc2_stderr\": 0.015786172341394154\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5810580204778157,\n \"acc_stderr\": 0.014418106953639013,\n\
\ \"acc_norm\": 0.6083617747440273,\n \"acc_norm_stderr\": 0.014264122124938217\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.626867157936666,\n\
\ \"acc_stderr\": 0.00482648558219101,\n \"acc_norm\": 0.8256323441545509,\n\
\ \"acc_norm_stderr\": 0.003786498856769124\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5921052631578947,\n \"acc_stderr\": 0.039993097127774734,\n\
\ \"acc_norm\": 0.5921052631578947,\n \"acc_norm_stderr\": 0.039993097127774734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5962264150943396,\n \"acc_stderr\": 0.030197611600197946,\n\
\ \"acc_norm\": 0.5962264150943396,\n \"acc_norm_stderr\": 0.030197611600197946\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.625,\n\
\ \"acc_stderr\": 0.04048439222695598,\n \"acc_norm\": 0.625,\n \
\ \"acc_norm_stderr\": 0.04048439222695598\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n\
\ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.5549132947976878,\n\
\ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207763,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207763\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4808510638297872,\n \"acc_stderr\": 0.032662042990646775,\n\
\ \"acc_norm\": 0.4808510638297872,\n \"acc_norm_stderr\": 0.032662042990646775\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30952380952380953,\n \"acc_stderr\": 0.023809523809523853,\n \"\
acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.023809523809523853\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04216370213557836,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04216370213557836\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6129032258064516,\n\
\ \"acc_stderr\": 0.027709359675032495,\n \"acc_norm\": 0.6129032258064516,\n\
\ \"acc_norm_stderr\": 0.027709359675032495\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.43842364532019706,\n \"acc_stderr\": 0.03491207857486518,\n\
\ \"acc_norm\": 0.43842364532019706,\n \"acc_norm_stderr\": 0.03491207857486518\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7323232323232324,\n \"acc_stderr\": 0.031544498882702846,\n \"\
acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.031544498882702846\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.02649905770139744,\n\
\ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.02649905770139744\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5538461538461539,\n \"acc_stderr\": 0.02520357177302833,\n \
\ \"acc_norm\": 0.5538461538461539,\n \"acc_norm_stderr\": 0.02520357177302833\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5840336134453782,\n \"acc_stderr\": 0.03201650100739611,\n \
\ \"acc_norm\": 0.5840336134453782,\n \"acc_norm_stderr\": 0.03201650100739611\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.781651376146789,\n \"acc_stderr\": 0.017712600528722727,\n \"\
acc_norm\": 0.781651376146789,\n \"acc_norm_stderr\": 0.017712600528722727\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.034028015813589656,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.034028015813589656\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967408,\n \"\
acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967408\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159267,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159267\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969637,\n\
\ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969637\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285713,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285713\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8461538461538461,\n\
\ \"acc_stderr\": 0.023636873317489294,\n \"acc_norm\": 0.8461538461538461,\n\
\ \"acc_norm_stderr\": 0.023636873317489294\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7701149425287356,\n\
\ \"acc_stderr\": 0.015046301846691814,\n \"acc_norm\": 0.7701149425287356,\n\
\ \"acc_norm_stderr\": 0.015046301846691814\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.0261521986197268,\n\
\ \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.0261521986197268\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3474860335195531,\n\
\ \"acc_stderr\": 0.01592556406020815,\n \"acc_norm\": 0.3474860335195531,\n\
\ \"acc_norm_stderr\": 0.01592556406020815\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5816993464052288,\n \"acc_stderr\": 0.028245134024387292,\n\
\ \"acc_norm\": 0.5816993464052288,\n \"acc_norm_stderr\": 0.028245134024387292\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.639871382636656,\n\
\ \"acc_stderr\": 0.027264297599804012,\n \"acc_norm\": 0.639871382636656,\n\
\ \"acc_norm_stderr\": 0.027264297599804012\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6481481481481481,\n \"acc_stderr\": 0.026571483480719964,\n\
\ \"acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.026571483480719964\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.43617021276595747,\n \"acc_stderr\": 0.02958345203628407,\n \
\ \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.02958345203628407\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4367666232073012,\n\
\ \"acc_stderr\": 0.012667701919603652,\n \"acc_norm\": 0.4367666232073012,\n\
\ \"acc_norm_stderr\": 0.012667701919603652\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5625,\n \"acc_stderr\": 0.030134614954403924,\n \
\ \"acc_norm\": 0.5625,\n \"acc_norm_stderr\": 0.030134614954403924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5718954248366013,\n \"acc_stderr\": 0.020017629214213087,\n \
\ \"acc_norm\": 0.5718954248366013,\n \"acc_norm_stderr\": 0.020017629214213087\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6163265306122448,\n \"acc_stderr\": 0.031130880396235933,\n\
\ \"acc_norm\": 0.6163265306122448,\n \"acc_norm_stderr\": 0.031130880396235933\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6069651741293532,\n\
\ \"acc_stderr\": 0.0345368246603156,\n \"acc_norm\": 0.6069651741293532,\n\
\ \"acc_norm_stderr\": 0.0345368246603156\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.040201512610368466,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.040201512610368466\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.038743715565879536,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.038743715565879536\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03126781714663179,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03126781714663179\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3806609547123623,\n\
\ \"mc1_stderr\": 0.016997627871907922,\n \"mc2\": 0.5331823936755519,\n\
\ \"mc2_stderr\": 0.015786172341394154\n }\n}\n```"
repo_url: https://huggingface.co/TFLai/OrcaMini-Platypus2-13B-QLoRA-0.80-epoch
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|arc:challenge|25_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hellaswag|10_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T22:52:27.560095.parquet'
- config_name: results
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- results_2023-08-28T22:52:27.560095.parquet
- split: latest
path:
- results_2023-08-28T22:52:27.560095.parquet
---
# Dataset Card for Evaluation run of TFLai/OrcaMini-Platypus2-13B-QLoRA-0.80-epoch
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TFLai/OrcaMini-Platypus2-13B-QLoRA-0.80-epoch
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TFLai/OrcaMini-Platypus2-13B-QLoRA-0.80-epoch](https://huggingface.co/TFLai/OrcaMini-Platypus2-13B-QLoRA-0.80-epoch) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TFLai__OrcaMini-Platypus2-13B-QLoRA-0.80-epoch",
"harness_truthfulqa_mc_0",
split="train")
```
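The per-run split names encode the run timestamp, with the `-` and `:` of ISO 8601 replaced by `_` (e.g. `2023_08_28T22_52_27.560095`). A small helper — an illustrative sketch, not part of the dataset tooling — can map such a split name back to a `datetime`:

```python
from datetime import datetime

def split_to_datetime(split_name: str) -> datetime:
    # Split names look like "2023_08_28T22_52_27.560095": the date part
    # uses "_" where ISO 8601 uses "-", and the time part uses "_" where
    # ISO 8601 uses ":". Undo both substitutions, then parse.
    date_part, time_part = split_name.split("T")
    iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.fromisoformat(iso)

# Example: the run recorded in this card
print(split_to_datetime("2023_08_28T22_52_27.560095"))  # 2023-08-28 22:52:27.560095
```

This is handy when a configuration accumulates several runs and you want to pick a split by date rather than relying on the `latest` alias.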
## Latest results
These are the [latest results from run 2023-08-28T22:52:27.560095](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__OrcaMini-Platypus2-13B-QLoRA-0.80-epoch/blob/main/results_2023-08-28T22%3A52%3A27.560095.json):
```python
{
"all": {
"acc": 0.5655100782367286,
"acc_stderr": 0.03434599353980488,
"acc_norm": 0.5693417551940863,
"acc_norm_stderr": 0.034325756733802795,
"mc1": 0.3806609547123623,
"mc1_stderr": 0.016997627871907922,
"mc2": 0.5331823936755519,
"mc2_stderr": 0.015786172341394154
},
"harness|arc:challenge|25": {
"acc": 0.5810580204778157,
"acc_stderr": 0.014418106953639013,
"acc_norm": 0.6083617747440273,
"acc_norm_stderr": 0.014264122124938217
},
"harness|hellaswag|10": {
"acc": 0.626867157936666,
"acc_stderr": 0.00482648558219101,
"acc_norm": 0.8256323441545509,
"acc_norm_stderr": 0.003786498856769124
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5921052631578947,
"acc_stderr": 0.039993097127774734,
"acc_norm": 0.5921052631578947,
"acc_norm_stderr": 0.039993097127774734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5962264150943396,
"acc_stderr": 0.030197611600197946,
"acc_norm": 0.5962264150943396,
"acc_norm_stderr": 0.030197611600197946
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.625,
"acc_stderr": 0.04048439222695598,
"acc_norm": 0.625,
"acc_norm_stderr": 0.04048439222695598
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.04576665403207763,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.04576665403207763
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4808510638297872,
"acc_stderr": 0.032662042990646775,
"acc_norm": 0.4808510638297872,
"acc_norm_stderr": 0.032662042990646775
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.023809523809523853,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.023809523809523853
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557836,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557836
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6129032258064516,
"acc_stderr": 0.027709359675032495,
"acc_norm": 0.6129032258064516,
"acc_norm_stderr": 0.027709359675032495
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43842364532019706,
"acc_stderr": 0.03491207857486518,
"acc_norm": 0.43842364532019706,
"acc_norm_stderr": 0.03491207857486518
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7323232323232324,
"acc_stderr": 0.031544498882702846,
"acc_norm": 0.7323232323232324,
"acc_norm_stderr": 0.031544498882702846
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.02649905770139744,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.02649905770139744
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5538461538461539,
"acc_stderr": 0.02520357177302833,
"acc_norm": 0.5538461538461539,
"acc_norm_stderr": 0.02520357177302833
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5840336134453782,
"acc_stderr": 0.03201650100739611,
"acc_norm": 0.5840336134453782,
"acc_norm_stderr": 0.03201650100739611
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.781651376146789,
"acc_stderr": 0.017712600528722727,
"acc_norm": 0.781651376146789,
"acc_norm_stderr": 0.017712600528722727
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.034028015813589656,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.034028015813589656
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967408,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967408
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159267,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159267
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.04225875451969637,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.04225875451969637
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285713,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285713
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8461538461538461,
"acc_stderr": 0.023636873317489294,
"acc_norm": 0.8461538461538461,
"acc_norm_stderr": 0.023636873317489294
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7701149425287356,
"acc_stderr": 0.015046301846691814,
"acc_norm": 0.7701149425287356,
"acc_norm_stderr": 0.015046301846691814
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.0261521986197268,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.0261521986197268
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3474860335195531,
"acc_stderr": 0.01592556406020815,
"acc_norm": 0.3474860335195531,
"acc_norm_stderr": 0.01592556406020815
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5816993464052288,
"acc_stderr": 0.028245134024387292,
"acc_norm": 0.5816993464052288,
"acc_norm_stderr": 0.028245134024387292
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.639871382636656,
"acc_stderr": 0.027264297599804012,
"acc_norm": 0.639871382636656,
"acc_norm_stderr": 0.027264297599804012
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.026571483480719964,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.026571483480719964
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.02958345203628407,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.02958345203628407
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4367666232073012,
"acc_stderr": 0.012667701919603652,
"acc_norm": 0.4367666232073012,
"acc_norm_stderr": 0.012667701919603652
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5625,
"acc_stderr": 0.030134614954403924,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.030134614954403924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5718954248366013,
"acc_stderr": 0.020017629214213087,
"acc_norm": 0.5718954248366013,
"acc_norm_stderr": 0.020017629214213087
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6163265306122448,
"acc_stderr": 0.031130880396235933,
"acc_norm": 0.6163265306122448,
"acc_norm_stderr": 0.031130880396235933
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6069651741293532,
"acc_stderr": 0.0345368246603156,
"acc_norm": 0.6069651741293532,
"acc_norm_stderr": 0.0345368246603156
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.040201512610368466,
"acc_norm": 0.8,
"acc_norm_stderr": 0.040201512610368466
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.038743715565879536,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.038743715565879536
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03126781714663179,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03126781714663179
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3806609547123623,
"mc1_stderr": 0.016997627871907922,
"mc2": 0.5331823936755519,
"mc2_stderr": 0.015786172341394154
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
jason-lee08/TinyStories_Mars | 2023-08-28T22:55:17.000Z | [
"region:us"
] | jason-lee08 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: prompts
dtype: string
- name: stories
dtype: string
splits:
- name: train
num_bytes: 5569066
num_examples: 3364
download_size: 2292233
dataset_size: 5569066
---
# Dataset Card for "TinyStories_Mars"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_TFLai__PuddleJumper-Platypus2-13B-QLoRA-0.80-epoch | 2023-08-28T22:57:06.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_TFLai__Stable-Platypus2-13B-QLoRA-0.80-epoch | 2023-08-28T22:58:21.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | Entry not found |
mshenoda/grand-piano | 2023-08-28T23:20:21.000Z | [
"license:creativeml-openrail-m",
"region:us"
] | mshenoda | null | null | null | 0 | 0 | ---
license: creativeml-openrail-m
---
|
open-llm-leaderboard/details_Sao10K__Medusa-13b | 2023-09-22T23:00:48.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Sao10K/Medusa-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Sao10K/Medusa-13b](https://huggingface.co/Sao10K/Medusa-13b) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Medusa-13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T23:00:36.340269](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Medusa-13b/blob/main/results_2023-09-22T23-00-36.340269.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.08682885906040269,\n\
\ \"em_stderr\": 0.0028836847948924805,\n \"f1\": 0.20613359899328837,\n\
\ \"f1_stderr\": 0.003265939806465616,\n \"acc\": 0.4007308040520042,\n\
\ \"acc_stderr\": 0.009687702523105881\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.08682885906040269,\n \"em_stderr\": 0.0028836847948924805,\n\
\ \"f1\": 0.20613359899328837,\n \"f1_stderr\": 0.003265939806465616\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06823351023502654,\n \
\ \"acc_stderr\": 0.006945358944067429\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7332280978689818,\n \"acc_stderr\": 0.012430046102144333\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Sao10K/Medusa-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|arc:challenge|25_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T23_00_36.340269
path:
- '**/details_harness|drop|3_2023-09-22T23-00-36.340269.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T23-00-36.340269.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T23_00_36.340269
path:
- '**/details_harness|gsm8k|5_2023-09-22T23-00-36.340269.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T23-00-36.340269.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hellaswag|10_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T23:11:54.790657.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T23:11:54.790657.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T23:11:54.790657.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T23_00_36.340269
path:
- '**/details_harness|winogrande|5_2023-09-22T23-00-36.340269.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T23-00-36.340269.parquet'
- config_name: results
data_files:
- split: 2023_08_28T23_11_54.790657
path:
- results_2023-08-28T23:11:54.790657.parquet
- split: 2023_09_22T23_00_36.340269
path:
- results_2023-09-22T23-00-36.340269.parquet
- split: latest
path:
- results_2023-09-22T23-00-36.340269.parquet
---
# Dataset Card for Evaluation run of Sao10K/Medusa-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Sao10K/Medusa-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Sao10K/Medusa-13b](https://huggingface.co/Sao10K/Medusa-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Medusa-13b",
"harness_winogrande_5",
split="train")
```
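Each run's split name is simply its timestamp with the separators replaced by underscores (compare the split `2023_09_22T23_00_36.340269` with the corresponding `results_2023-09-22T23-00-36.340269.parquet` file). A small helper (hypothetical, added here for illustration) derives a split name from an ISO run timestamp:

```python
def run_split_name(timestamp: str) -> str:
    """Turn an ISO run timestamp into the split-name format used by this
    dataset's configs: dashes and colons both become underscores."""
    return timestamp.replace("-", "_").replace(":", "_")

print(run_split_name("2023-09-22T23:00:36.340269"))
# -> 2023_09_22T23_00_36.340269
```

Passing `split=run_split_name("2023-09-22T23:00:36.340269")` to `load_dataset` then selects that specific run instead of the `"latest"` split.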
## Latest results
These are the [latest results from run 2023-09-22T23:00:36.340269](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Medusa-13b/blob/main/results_2023-09-22T23-00-36.340269.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.08682885906040269,
"em_stderr": 0.0028836847948924805,
"f1": 0.20613359899328837,
"f1_stderr": 0.003265939806465616,
"acc": 0.4007308040520042,
"acc_stderr": 0.009687702523105881
},
"harness|drop|3": {
"em": 0.08682885906040269,
"em_stderr": 0.0028836847948924805,
"f1": 0.20613359899328837,
"f1_stderr": 0.003265939806465616
},
"harness|gsm8k|5": {
"acc": 0.06823351023502654,
"acc_stderr": 0.006945358944067429
},
"harness|winogrande|5": {
"acc": 0.7332280978689818,
"acc_stderr": 0.012430046102144333
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_NobodyExistsOnTheInternet__PuffedConvo13bLoraE4 | 2023-09-13T00:02:32.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of NobodyExistsOnTheInternet/PuffedConvo13bLoraE4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NobodyExistsOnTheInternet/PuffedConvo13bLoraE4](https://huggingface.co/NobodyExistsOnTheInternet/PuffedConvo13bLoraE4)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NobodyExistsOnTheInternet__PuffedConvo13bLoraE4\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-13T00:01:07.493301](https://huggingface.co/datasets/open-llm-leaderboard/details_NobodyExistsOnTheInternet__PuffedConvo13bLoraE4/blob/main/results_2023-09-13T00-01-07.493301.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5395636055450799,\n\
\ \"acc_stderr\": 0.03442786414821514,\n \"acc_norm\": 0.5433949771959586,\n\
\ \"acc_norm_stderr\": 0.03440597109326057,\n \"mc1\": 0.2668298653610771,\n\
\ \"mc1_stderr\": 0.015483691939237265,\n \"mc2\": 0.3981838970380214,\n\
\ \"mc2_stderr\": 0.01504628923441546\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5588737201365188,\n \"acc_stderr\": 0.014509747749064663,\n\
\ \"acc_norm\": 0.5964163822525598,\n \"acc_norm_stderr\": 0.014337158914268441\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.655148376817367,\n\
\ \"acc_stderr\": 0.004743484528346669,\n \"acc_norm\": 0.8436566421031667,\n\
\ \"acc_norm_stderr\": 0.003624383120823452\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45185185185185184,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.45185185185185184,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5394736842105263,\n \"acc_stderr\": 0.04056242252249033,\n\
\ \"acc_norm\": 0.5394736842105263,\n \"acc_norm_stderr\": 0.04056242252249033\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5622641509433962,\n \"acc_stderr\": 0.030533338430467516,\n\
\ \"acc_norm\": 0.5622641509433962,\n \"acc_norm_stderr\": 0.030533338430467516\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5416666666666666,\n\
\ \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.5416666666666666,\n\
\ \"acc_norm_stderr\": 0.04166666666666665\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n\
\ \"acc_stderr\": 0.03807301726504511,\n \"acc_norm\": 0.5260115606936416,\n\
\ \"acc_norm_stderr\": 0.03807301726504511\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.032400380867927465,\n\
\ \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.032400380867927465\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489361,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489361\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.0416656757710158,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.0416656757710158\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2962962962962963,\n \"acc_stderr\": 0.023517294335963286,\n \"\
acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.023517294335963286\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574924,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574924\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6193548387096774,\n\
\ \"acc_stderr\": 0.02762171783290703,\n \"acc_norm\": 0.6193548387096774,\n\
\ \"acc_norm_stderr\": 0.02762171783290703\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.35467980295566504,\n \"acc_stderr\": 0.03366124489051448,\n\
\ \"acc_norm\": 0.35467980295566504,\n \"acc_norm_stderr\": 0.03366124489051448\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.696969696969697,\n \"acc_stderr\": 0.032742879140268674,\n \"\
acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.032742879140268674\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7875647668393783,\n \"acc_stderr\": 0.029519282616817234,\n\
\ \"acc_norm\": 0.7875647668393783,\n \"acc_norm_stderr\": 0.029519282616817234\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4897435897435897,\n \"acc_stderr\": 0.025345672221942374,\n\
\ \"acc_norm\": 0.4897435897435897,\n \"acc_norm_stderr\": 0.025345672221942374\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.542016806722689,\n \"acc_stderr\": 0.03236361111951941,\n \
\ \"acc_norm\": 0.542016806722689,\n \"acc_norm_stderr\": 0.03236361111951941\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360384,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360384\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.726605504587156,\n \"acc_stderr\": 0.019109299846098278,\n \"\
acc_norm\": 0.726605504587156,\n \"acc_norm_stderr\": 0.019109299846098278\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.375,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.375,\n\
\ \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.031660096793998116,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.031660096793998116\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7468354430379747,\n \"acc_stderr\": 0.028304657943035303,\n \
\ \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.028304657943035303\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6547085201793722,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.6547085201793722,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7107438016528925,\n \"acc_stderr\": 0.041391127276354626,\n \"\
acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.041391127276354626\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.04668408033024931,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.04668408033024931\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935575,\n\
\ \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935575\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.04582124160161549,\n\
\ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.04582124160161549\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7905982905982906,\n\
\ \"acc_stderr\": 0.02665569965392275,\n \"acc_norm\": 0.7905982905982906,\n\
\ \"acc_norm_stderr\": 0.02665569965392275\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7509578544061303,\n\
\ \"acc_stderr\": 0.015464676163395953,\n \"acc_norm\": 0.7509578544061303,\n\
\ \"acc_norm_stderr\": 0.015464676163395953\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6011560693641619,\n \"acc_stderr\": 0.026362437574546545,\n\
\ \"acc_norm\": 0.6011560693641619,\n \"acc_norm_stderr\": 0.026362437574546545\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25921787709497207,\n\
\ \"acc_stderr\": 0.014655780837497735,\n \"acc_norm\": 0.25921787709497207,\n\
\ \"acc_norm_stderr\": 0.014655780837497735\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5784313725490197,\n \"acc_stderr\": 0.02827549015679146,\n\
\ \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.02827549015679146\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6688102893890675,\n\
\ \"acc_stderr\": 0.02673062072800491,\n \"acc_norm\": 0.6688102893890675,\n\
\ \"acc_norm_stderr\": 0.02673062072800491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.026869490744815254,\n\
\ \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.026869490744815254\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.39361702127659576,\n \"acc_stderr\": 0.029144544781596143,\n \
\ \"acc_norm\": 0.39361702127659576,\n \"acc_norm_stderr\": 0.029144544781596143\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41199478487614083,\n\
\ \"acc_stderr\": 0.01257087103214607,\n \"acc_norm\": 0.41199478487614083,\n\
\ \"acc_norm_stderr\": 0.01257087103214607\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4889705882352941,\n \"acc_stderr\": 0.030365446477275675,\n\
\ \"acc_norm\": 0.4889705882352941,\n \"acc_norm_stderr\": 0.030365446477275675\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5669934640522876,\n \"acc_stderr\": 0.020045442473324227,\n \
\ \"acc_norm\": 0.5669934640522876,\n \"acc_norm_stderr\": 0.020045442473324227\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6163265306122448,\n \"acc_stderr\": 0.031130880396235933,\n\
\ \"acc_norm\": 0.6163265306122448,\n \"acc_norm_stderr\": 0.031130880396235933\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n\
\ \"acc_stderr\": 0.030567675938916718,\n \"acc_norm\": 0.7512437810945274,\n\
\ \"acc_norm_stderr\": 0.030567675938916718\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117825,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117825\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2668298653610771,\n\
\ \"mc1_stderr\": 0.015483691939237265,\n \"mc2\": 0.3981838970380214,\n\
\ \"mc2_stderr\": 0.01504628923441546\n }\n}\n```"
repo_url: https://huggingface.co/NobodyExistsOnTheInternet/PuffedConvo13bLoraE4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|arc:challenge|25_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|arc:challenge|25_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hellaswag|10_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hellaswag|10_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T23:15:40.572782.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T00-01-07.493301.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T00-01-07.493301.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T23:15:40.572782.parquet'
- split: 2023_09_13T00_01_07.493301
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T00-01-07.493301.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T00-01-07.493301.parquet'
- config_name: results
data_files:
- split: 2023_08_28T23_15_40.572782
path:
- results_2023-08-28T23:15:40.572782.parquet
- split: 2023_09_13T00_01_07.493301
path:
- results_2023-09-13T00-01-07.493301.parquet
- split: latest
path:
- results_2023-09-13T00-01-07.493301.parquet
---
# Dataset Card for Evaluation run of NobodyExistsOnTheInternet/PuffedConvo13bLoraE4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NobodyExistsOnTheInternet/PuffedConvo13bLoraE4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [NobodyExistsOnTheInternet/PuffedConvo13bLoraE4](https://huggingface.co/NobodyExistsOnTheInternet/PuffedConvo13bLoraE4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NobodyExistsOnTheInternet__PuffedConvo13bLoraE4",
"harness_truthfulqa_mc_0",
	split="latest")
```
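The per-run splits are named after the run timestamp (e.g. `2023_09_13T00_01_07.493301`), while `latest` is an alias for the most recent run. Because the timestamps are zero-padded, a minimal sketch of resolving the newest run yourself, without network access, is just a lexicographic `max()` over the split names (the two names below are taken from this card's config listing):

```python
# Run splits are named "YYYY_MM_DDTHH_MM_SS.ffffff"; zero-padded fields mean
# lexicographic order matches chronological order, so max() picks the newest run.
run_splits = [
    "2023_08_28T23_15_40.572782",
    "2023_09_13T00_01_07.493301",
]

latest_run = max(run_splits)  # -> "2023_09_13T00_01_07.493301"
```

In practice you would pass either `latest_run` or simply `"latest"` as the `split` argument to `load_dataset`.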
## Latest results
These are the [latest results from run 2023-09-13T00:01:07.493301](https://huggingface.co/datasets/open-llm-leaderboard/details_NobodyExistsOnTheInternet__PuffedConvo13bLoraE4/blob/main/results_2023-09-13T00-01-07.493301.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5395636055450799,
"acc_stderr": 0.03442786414821514,
"acc_norm": 0.5433949771959586,
"acc_norm_stderr": 0.03440597109326057,
"mc1": 0.2668298653610771,
"mc1_stderr": 0.015483691939237265,
"mc2": 0.3981838970380214,
"mc2_stderr": 0.01504628923441546
},
"harness|arc:challenge|25": {
"acc": 0.5588737201365188,
"acc_stderr": 0.014509747749064663,
"acc_norm": 0.5964163822525598,
"acc_norm_stderr": 0.014337158914268441
},
"harness|hellaswag|10": {
"acc": 0.655148376817367,
"acc_stderr": 0.004743484528346669,
"acc_norm": 0.8436566421031667,
"acc_norm_stderr": 0.003624383120823452
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45185185185185184,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.45185185185185184,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5394736842105263,
"acc_stderr": 0.04056242252249033,
"acc_norm": 0.5394736842105263,
"acc_norm_stderr": 0.04056242252249033
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5622641509433962,
"acc_stderr": 0.030533338430467516,
"acc_norm": 0.5622641509433962,
"acc_norm_stderr": 0.030533338430467516
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.04166666666666665,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.04166666666666665
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.03807301726504511,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.03807301726504511
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4340425531914894,
"acc_stderr": 0.032400380867927465,
"acc_norm": 0.4340425531914894,
"acc_norm_stderr": 0.032400380867927465
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489361,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489361
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.0416656757710158,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.0416656757710158
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.023517294335963286,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.023517294335963286
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574924,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574924
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6193548387096774,
"acc_stderr": 0.02762171783290703,
"acc_norm": 0.6193548387096774,
"acc_norm_stderr": 0.02762171783290703
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35467980295566504,
"acc_stderr": 0.03366124489051448,
"acc_norm": 0.35467980295566504,
"acc_norm_stderr": 0.03366124489051448
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.032742879140268674,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.032742879140268674
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7875647668393783,
"acc_stderr": 0.029519282616817234,
"acc_norm": 0.7875647668393783,
"acc_norm_stderr": 0.029519282616817234
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4897435897435897,
"acc_stderr": 0.025345672221942374,
"acc_norm": 0.4897435897435897,
"acc_norm_stderr": 0.025345672221942374
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.542016806722689,
"acc_stderr": 0.03236361111951941,
"acc_norm": 0.542016806722689,
"acc_norm_stderr": 0.03236361111951941
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360384,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360384
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.726605504587156,
"acc_stderr": 0.019109299846098278,
"acc_norm": 0.726605504587156,
"acc_norm_stderr": 0.019109299846098278
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.375,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.375,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.031660096793998116,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.031660096793998116
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.028304657943035303,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.028304657943035303
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6547085201793722,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.6547085201793722,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.041391127276354626,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.041391127276354626
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.04668408033024931,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.04668408033024931
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935575,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935575
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.04582124160161549,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.04582124160161549
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7905982905982906,
"acc_stderr": 0.02665569965392275,
"acc_norm": 0.7905982905982906,
"acc_norm_stderr": 0.02665569965392275
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7509578544061303,
"acc_stderr": 0.015464676163395953,
"acc_norm": 0.7509578544061303,
"acc_norm_stderr": 0.015464676163395953
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.026362437574546545,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.026362437574546545
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25921787709497207,
"acc_stderr": 0.014655780837497735,
"acc_norm": 0.25921787709497207,
"acc_norm_stderr": 0.014655780837497735
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.02827549015679146,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.02827549015679146
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6688102893890675,
"acc_stderr": 0.02673062072800491,
"acc_norm": 0.6688102893890675,
"acc_norm_stderr": 0.02673062072800491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.026869490744815254,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.026869490744815254
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.39361702127659576,
"acc_stderr": 0.029144544781596143,
"acc_norm": 0.39361702127659576,
"acc_norm_stderr": 0.029144544781596143
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41199478487614083,
"acc_stderr": 0.01257087103214607,
"acc_norm": 0.41199478487614083,
"acc_norm_stderr": 0.01257087103214607
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4889705882352941,
"acc_stderr": 0.030365446477275675,
"acc_norm": 0.4889705882352941,
"acc_norm_stderr": 0.030365446477275675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5669934640522876,
"acc_stderr": 0.020045442473324227,
"acc_norm": 0.5669934640522876,
"acc_norm_stderr": 0.020045442473324227
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6163265306122448,
"acc_stderr": 0.031130880396235933,
"acc_norm": 0.6163265306122448,
"acc_norm_stderr": 0.031130880396235933
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7512437810945274,
"acc_stderr": 0.030567675938916718,
"acc_norm": 0.7512437810945274,
"acc_norm_stderr": 0.030567675938916718
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117825,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117825
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2668298653610771,
"mc1_stderr": 0.015483691939237265,
"mc2": 0.3981838970380214,
"mc2_stderr": 0.01504628923441546
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
jryan-pol/flags | 2023-10-06T12:38:26.000Z | [
"region:us"
] | jryan-pol | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data.csv
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_xxyyy123__test_orca_10k_v1_lora_gdu_safe | 2023-08-28T23:22:18.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | Entry not found |
pachequinho/twitter_airlines_pos_neg_small | 2023-08-29T00:04:31.000Z | [
"license:apache-2.0",
"region:us"
] | pachequinho | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
spsither/prepare_dataset_train_batch3 | 2023-08-29T03:00:35.000Z | [
"region:us"
] | spsither | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 95836096584
num_examples: 99761
download_size: 20673138275
dataset_size: 95836096584
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "prepare_dataset_train_batch3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_xxyyy123__test_merge_p_ov1_w0.66_w0.5_n1 | 2023-08-29T00:09:58.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of xxyyy123/test_merge_p_ov1_w0.66_w0.5_n1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [xxyyy123/test_merge_p_ov1_w0.66_w0.5_n1](https://huggingface.co/xxyyy123/test_merge_p_ov1_w0.66_w0.5_n1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xxyyy123__test_merge_p_ov1_w0.66_w0.5_n1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T00:08:25.370219](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__test_merge_p_ov1_w0.66_w0.5_n1/blob/main/results_2023-08-29T00%3A08%3A25.370219.json):\n\
\n```python\n{\n \"all\": {\n \"acc\": 0.5815765512888643,\n \"\
acc_stderr\": 0.034085704613422384,\n \"acc_norm\": 0.5853409830541163,\n\
\ \"acc_norm_stderr\": 0.03406560528917499,\n \"mc1\": 0.401468788249694,\n\
\ \"mc1_stderr\": 0.017160273901693657,\n \"mc2\": 0.5617854888760936,\n\
\ \"mc2_stderr\": 0.015820333707933832\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6015358361774744,\n \"acc_stderr\": 0.014306946052735567,\n\
\ \"acc_norm\": 0.6245733788395904,\n \"acc_norm_stderr\": 0.014150631435111728\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6246763592909779,\n\
\ \"acc_stderr\": 0.0048321678545016405,\n \"acc_norm\": 0.8237402907787293,\n\
\ \"acc_norm_stderr\": 0.003802622341529012\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.0404633688397825,\n\
\ \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.0404633688397825\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.03005258057955784,\n\
\ \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.03005258057955784\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n\
\ \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n\
\ \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5722543352601156,\n\
\ \"acc_stderr\": 0.03772446857518025,\n \"acc_norm\": 0.5722543352601156,\n\
\ \"acc_norm_stderr\": 0.03772446857518025\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.047240073523838876,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.047240073523838876\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.03267862331014063,\n\
\ \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.03267862331014063\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192118,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192118\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3439153439153439,\n \"acc_stderr\": 0.024464426625596433,\n \"\
acc_norm\": 0.3439153439153439,\n \"acc_norm_stderr\": 0.024464426625596433\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.02606936229533513,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.02606936229533513\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.03465304488406796,\n\
\ \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.03465304488406796\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.0303137105381989,\n \"acc_norm\"\
: 0.7626262626262627,\n \"acc_norm_stderr\": 0.0303137105381989\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n\
\ \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6128205128205129,\n \"acc_stderr\": 0.02469721693087894,\n \
\ \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.02469721693087894\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228412,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228412\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.5840336134453782,\n \"acc_stderr\": 0.032016501007396114,\n\
\ \"acc_norm\": 0.5840336134453782,\n \"acc_norm_stderr\": 0.032016501007396114\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7853211009174312,\n \"acc_stderr\": 0.01760430414925648,\n \"\
acc_norm\": 0.7853211009174312,\n \"acc_norm_stderr\": 0.01760430414925648\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4212962962962963,\n \"acc_stderr\": 0.03367462138896078,\n \"\
acc_norm\": 0.4212962962962963,\n \"acc_norm_stderr\": 0.03367462138896078\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640766,\n \"\
acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640766\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229146,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229146\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.04118438565806297,\n\
\ \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.04118438565806297\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908706,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908706\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6503067484662577,\n \"acc_stderr\": 0.03746668325470021,\n\
\ \"acc_norm\": 0.6503067484662577,\n \"acc_norm_stderr\": 0.03746668325470021\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7713920817369093,\n\
\ \"acc_stderr\": 0.015016884698539873,\n \"acc_norm\": 0.7713920817369093,\n\
\ \"acc_norm_stderr\": 0.015016884698539873\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6271676300578035,\n \"acc_stderr\": 0.02603389061357628,\n\
\ \"acc_norm\": 0.6271676300578035,\n \"acc_norm_stderr\": 0.02603389061357628\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3843575418994413,\n\
\ \"acc_stderr\": 0.0162690886639594,\n \"acc_norm\": 0.3843575418994413,\n\
\ \"acc_norm_stderr\": 0.0162690886639594\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6143790849673203,\n \"acc_stderr\": 0.02787074527829028,\n\
\ \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.02787074527829028\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6591639871382636,\n\
\ \"acc_stderr\": 0.026920841260776165,\n \"acc_norm\": 0.6591639871382636,\n\
\ \"acc_norm_stderr\": 0.026920841260776165\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6975308641975309,\n \"acc_stderr\": 0.02555765398186806,\n\
\ \"acc_norm\": 0.6975308641975309,\n \"acc_norm_stderr\": 0.02555765398186806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4432624113475177,\n \"acc_stderr\": 0.029634838473766,\n \
\ \"acc_norm\": 0.4432624113475177,\n \"acc_norm_stderr\": 0.029634838473766\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4361147327249022,\n\
\ \"acc_stderr\": 0.012665568135455326,\n \"acc_norm\": 0.4361147327249022,\n\
\ \"acc_norm_stderr\": 0.012665568135455326\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5808823529411765,\n \"acc_stderr\": 0.029972807170464626,\n\
\ \"acc_norm\": 0.5808823529411765,\n \"acc_norm_stderr\": 0.029972807170464626\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5882352941176471,\n \"acc_stderr\": 0.019910377463105935,\n \
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.019910377463105935\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6653061224489796,\n \"acc_stderr\": 0.030209235226242307,\n\
\ \"acc_norm\": 0.6653061224489796,\n \"acc_norm_stderr\": 0.030209235226242307\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7860696517412935,\n\
\ \"acc_stderr\": 0.02899690969332891,\n \"acc_norm\": 0.7860696517412935,\n\
\ \"acc_norm_stderr\": 0.02899690969332891\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03126781714663179,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03126781714663179\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.401468788249694,\n\
\ \"mc1_stderr\": 0.017160273901693657,\n \"mc2\": 0.5617854888760936,\n\
\ \"mc2_stderr\": 0.015820333707933832\n }\n}\n```"
repo_url: https://huggingface.co/xxyyy123/test_merge_p_ov1_w0.66_w0.5_n1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|arc:challenge|25_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hellaswag|10_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T00:08:25.370219.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T00:08:25.370219.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T00:08:25.370219.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T00:08:25.370219.parquet'
- config_name: results
data_files:
- split: 2023_08_29T00_08_25.370219
path:
- results_2023-08-29T00:08:25.370219.parquet
- split: latest
path:
- results_2023-08-29T00:08:25.370219.parquet
---
# Dataset Card for Evaluation run of xxyyy123/test_merge_p_ov1_w0.66_w0.5_n1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/xxyyy123/test_merge_p_ov1_w0.66_w0.5_n1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [xxyyy123/test_merge_p_ov1_w0.66_w0.5_n1](https://huggingface.co/xxyyy123/test_merge_p_ov1_w0.66_w0.5_n1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xxyyy123__test_merge_p_ov1_w0.66_w0.5_n1",
"harness_truthfulqa_mc_0",
	split="latest")
```
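Each timestamped split name is simply the run timestamp with the `-` and `:` separators replaced by `_` (an assumption inferred from the split names listed in this card, not a documented `datasets` API). A minimal sketch of that mapping, with an illustrative helper name:

```python
def timestamp_to_split_name(timestamp: str) -> str:
    """Convert a run timestamp such as '2023-08-29T00:08:25.370219'
    into the corresponding split name used in this dataset."""
    date_part, time_part = timestamp.split("T")
    # '-' in the date and ':' in the time both become '_'
    return date_part.replace("-", "_") + "T" + time_part.replace(":", "_")

print(timestamp_to_split_name("2023-08-29T00:08:25.370219"))
# 2023_08_29T00_08_25.370219
```

The resulting string can be passed as the `split` argument in place of `"latest"` to load a specific run.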
## Latest results
These are the [latest results from run 2023-08-29T00:08:25.370219](https://huggingface.co/datasets/open-llm-leaderboard/details_xxyyy123__test_merge_p_ov1_w0.66_w0.5_n1/blob/main/results_2023-08-29T00%3A08%3A25.370219.json):
```json
{
"all": {
"acc": 0.5815765512888643,
"acc_stderr": 0.034085704613422384,
"acc_norm": 0.5853409830541163,
"acc_norm_stderr": 0.03406560528917499,
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693657,
"mc2": 0.5617854888760936,
"mc2_stderr": 0.015820333707933832
},
"harness|arc:challenge|25": {
"acc": 0.6015358361774744,
"acc_stderr": 0.014306946052735567,
"acc_norm": 0.6245733788395904,
"acc_norm_stderr": 0.014150631435111728
},
"harness|hellaswag|10": {
"acc": 0.6246763592909779,
"acc_stderr": 0.0048321678545016405,
"acc_norm": 0.8237402907787293,
"acc_norm_stderr": 0.003802622341529012
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.0404633688397825,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.0404633688397825
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6075471698113207,
"acc_stderr": 0.03005258057955784,
"acc_norm": 0.6075471698113207,
"acc_norm_stderr": 0.03005258057955784
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.03772446857518025,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.03772446857518025
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.047240073523838876,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.047240073523838876
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192118,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192118
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3439153439153439,
"acc_stderr": 0.024464426625596433,
"acc_norm": 0.3439153439153439,
"acc_norm_stderr": 0.024464426625596433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7,
"acc_stderr": 0.02606936229533513,
"acc_norm": 0.7,
"acc_norm_stderr": 0.02606936229533513
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.03465304488406796,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.03465304488406796
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.0303137105381989,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.0303137105381989
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6128205128205129,
"acc_stderr": 0.02469721693087894,
"acc_norm": 0.6128205128205129,
"acc_norm_stderr": 0.02469721693087894
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228412,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228412
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5840336134453782,
"acc_stderr": 0.032016501007396114,
"acc_norm": 0.5840336134453782,
"acc_norm_stderr": 0.032016501007396114
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7853211009174312,
"acc_stderr": 0.01760430414925648,
"acc_norm": 0.7853211009174312,
"acc_norm_stderr": 0.01760430414925648
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4212962962962963,
"acc_stderr": 0.03367462138896078,
"acc_norm": 0.4212962962962963,
"acc_norm_stderr": 0.03367462138896078
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.027599174300640766,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.027599174300640766
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229146,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229146
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.04118438565806297,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.04118438565806297
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908706,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908706
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6503067484662577,
"acc_stderr": 0.03746668325470021,
"acc_norm": 0.6503067484662577,
"acc_norm_stderr": 0.03746668325470021
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7713920817369093,
"acc_stderr": 0.015016884698539873,
"acc_norm": 0.7713920817369093,
"acc_norm_stderr": 0.015016884698539873
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6271676300578035,
"acc_stderr": 0.02603389061357628,
"acc_norm": 0.6271676300578035,
"acc_norm_stderr": 0.02603389061357628
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3843575418994413,
"acc_stderr": 0.0162690886639594,
"acc_norm": 0.3843575418994413,
"acc_norm_stderr": 0.0162690886639594
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.02787074527829028,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.02787074527829028
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6591639871382636,
"acc_stderr": 0.026920841260776165,
"acc_norm": 0.6591639871382636,
"acc_norm_stderr": 0.026920841260776165
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6975308641975309,
"acc_stderr": 0.02555765398186806,
"acc_norm": 0.6975308641975309,
"acc_norm_stderr": 0.02555765398186806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4432624113475177,
"acc_stderr": 0.029634838473766,
"acc_norm": 0.4432624113475177,
"acc_norm_stderr": 0.029634838473766
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4361147327249022,
"acc_stderr": 0.012665568135455326,
"acc_norm": 0.4361147327249022,
"acc_norm_stderr": 0.012665568135455326
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5808823529411765,
"acc_stderr": 0.029972807170464626,
"acc_norm": 0.5808823529411765,
"acc_norm_stderr": 0.029972807170464626
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.019910377463105935,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.019910377463105935
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6653061224489796,
"acc_stderr": 0.030209235226242307,
"acc_norm": 0.6653061224489796,
"acc_norm_stderr": 0.030209235226242307
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7860696517412935,
"acc_stderr": 0.02899690969332891,
"acc_norm": 0.7860696517412935,
"acc_norm_stderr": 0.02899690969332891
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03126781714663179,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03126781714663179
},
"harness|truthfulqa:mc|0": {
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693657,
"mc2": 0.5617854888760936,
"mc2_stderr": 0.015820333707933832
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
pachequinho/restaurant_reviews | 2023-08-29T00:27:03.000Z | [
"license:apache-2.0",
"region:us"
] | pachequinho | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_Writer__palmyra-large | 2023-08-29T00:25:15.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Writer/palmyra-large
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Writer/palmyra-large](https://huggingface.co/Writer/palmyra-large) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Writer__palmyra-large\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T00:23:42.233683](https://huggingface.co/datasets/open-llm-leaderboard/details_Writer__palmyra-large/blob/main/results_2023-08-29T00%3A23%3A42.233683.json):\n\
\n```python\n{\n \"all\": {\n \"acc\": 0.29181612856230493,\n \"\
acc_stderr\": 0.03281505101765732,\n \"acc_norm\": 0.295503113290656,\n \
\ \"acc_norm_stderr\": 0.03280873704113449,\n \"mc1\": 0.2215422276621787,\n\
\ \"mc1_stderr\": 0.014537867601301137,\n \"mc2\": 0.3592861283297082,\n\
\ \"mc2_stderr\": 0.013480719092296862\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4189419795221843,\n \"acc_stderr\": 0.014418106953639015,\n\
\ \"acc_norm\": 0.4496587030716723,\n \"acc_norm_stderr\": 0.014537144444284736\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5316669986058554,\n\
\ \"acc_stderr\": 0.004979763862134998,\n \"acc_norm\": 0.7184823740290779,\n\
\ \"acc_norm_stderr\": 0.004488201756642573\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.03785714465066653,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.03785714465066653\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.29605263157894735,\n \"acc_stderr\": 0.03715062154998905,\n\
\ \"acc_norm\": 0.29605263157894735,\n \"acc_norm_stderr\": 0.03715062154998905\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.24,\n\
\ \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.24,\n \
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.26037735849056604,\n \"acc_stderr\": 0.027008766090708094,\n\
\ \"acc_norm\": 0.26037735849056604,\n \"acc_norm_stderr\": 0.027008766090708094\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n\
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3179190751445087,\n\
\ \"acc_stderr\": 0.03550683989165582,\n \"acc_norm\": 0.3179190751445087,\n\
\ \"acc_norm_stderr\": 0.03550683989165582\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.03708284662416544,\n\
\ \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.03708284662416544\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2425531914893617,\n \"acc_stderr\": 0.028020226271200217,\n\
\ \"acc_norm\": 0.2425531914893617,\n \"acc_norm_stderr\": 0.028020226271200217\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2896551724137931,\n \"acc_stderr\": 0.03780019230438015,\n\
\ \"acc_norm\": 0.2896551724137931,\n \"acc_norm_stderr\": 0.03780019230438015\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.02256989707491841,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02256989707491841\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.03932537680392871,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.03932537680392871\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.2645161290322581,\n \"acc_stderr\": 0.02509189237885928,\n \"\
acc_norm\": 0.2645161290322581,\n \"acc_norm_stderr\": 0.02509189237885928\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.19704433497536947,\n \"acc_stderr\": 0.027986724666736226,\n \"\
acc_norm\": 0.19704433497536947,\n \"acc_norm_stderr\": 0.027986724666736226\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206824,\n \"acc_norm\"\
: 0.29,\n \"acc_norm_stderr\": 0.045604802157206824\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3151515151515151,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.3151515151515151,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2878787878787879,\n \"acc_stderr\": 0.03225883512300992,\n \"\
acc_norm\": 0.2878787878787879,\n \"acc_norm_stderr\": 0.03225883512300992\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.32642487046632124,\n \"acc_stderr\": 0.033840286211432945,\n\
\ \"acc_norm\": 0.32642487046632124,\n \"acc_norm_stderr\": 0.033840286211432945\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.24358974358974358,\n \"acc_stderr\": 0.021763733684173923,\n\
\ \"acc_norm\": 0.24358974358974358,\n \"acc_norm_stderr\": 0.021763733684173923\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.02606715922227579,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.02606715922227579\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.25630252100840334,\n \"acc_stderr\": 0.02835962087053395,\n\
\ \"acc_norm\": 0.25630252100840334,\n \"acc_norm_stderr\": 0.02835962087053395\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389024,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.28256880733944956,\n \"acc_stderr\": 0.01930424349770715,\n \"\
acc_norm\": 0.28256880733944956,\n \"acc_norm_stderr\": 0.01930424349770715\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.23148148148148148,\n \"acc_stderr\": 0.028765111718046982,\n \"\
acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.028765111718046982\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.31862745098039214,\n \"acc_stderr\": 0.0327028718148208,\n \"\
acc_norm\": 0.31862745098039214,\n \"acc_norm_stderr\": 0.0327028718148208\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.33755274261603374,\n \"acc_stderr\": 0.030781549102026216,\n \
\ \"acc_norm\": 0.33755274261603374,\n \"acc_norm_stderr\": 0.030781549102026216\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3811659192825112,\n\
\ \"acc_stderr\": 0.03259625118416828,\n \"acc_norm\": 0.3811659192825112,\n\
\ \"acc_norm_stderr\": 0.03259625118416828\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3435114503816794,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.3435114503816794,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.3884297520661157,\n \"acc_stderr\": 0.04449270350068382,\n \"\
acc_norm\": 0.3884297520661157,\n \"acc_norm_stderr\": 0.04449270350068382\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3128834355828221,\n \"acc_stderr\": 0.036429145782924055,\n\
\ \"acc_norm\": 0.3128834355828221,\n \"acc_norm_stderr\": 0.036429145782924055\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.32051282051282054,\n\
\ \"acc_stderr\": 0.030572811310299604,\n \"acc_norm\": 0.32051282051282054,\n\
\ \"acc_norm_stderr\": 0.030572811310299604\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.33205619412515963,\n\
\ \"acc_stderr\": 0.016841174655295728,\n \"acc_norm\": 0.33205619412515963,\n\
\ \"acc_norm_stderr\": 0.016841174655295728\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2745664739884393,\n \"acc_stderr\": 0.024027745155265012,\n\
\ \"acc_norm\": 0.2745664739884393,\n \"acc_norm_stderr\": 0.024027745155265012\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3006535947712418,\n \"acc_stderr\": 0.02625605383571896,\n\
\ \"acc_norm\": 0.3006535947712418,\n \"acc_norm_stderr\": 0.02625605383571896\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3183279742765273,\n\
\ \"acc_stderr\": 0.026457225067811032,\n \"acc_norm\": 0.3183279742765273,\n\
\ \"acc_norm_stderr\": 0.026457225067811032\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.025842248700902164,\n\
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.025842248700902164\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.26595744680851063,\n \"acc_stderr\": 0.026358065698880585,\n \
\ \"acc_norm\": 0.26595744680851063,\n \"acc_norm_stderr\": 0.026358065698880585\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.29465449804432853,\n\
\ \"acc_stderr\": 0.01164357676406955,\n \"acc_norm\": 0.29465449804432853,\n\
\ \"acc_norm_stderr\": 0.01164357676406955\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.024562204314142314,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.024562204314142314\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2908496732026144,\n \"acc_stderr\": 0.018373116915903966,\n \
\ \"acc_norm\": 0.2908496732026144,\n \"acc_norm_stderr\": 0.018373116915903966\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.24545454545454545,\n\
\ \"acc_stderr\": 0.04122066502878284,\n \"acc_norm\": 0.24545454545454545,\n\
\ \"acc_norm_stderr\": 0.04122066502878284\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.27346938775510204,\n \"acc_stderr\": 0.028535560337128438,\n\
\ \"acc_norm\": 0.27346938775510204,\n \"acc_norm_stderr\": 0.028535560337128438\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.3034825870646766,\n\
\ \"acc_stderr\": 0.032510068164586174,\n \"acc_norm\": 0.3034825870646766,\n\
\ \"acc_norm_stderr\": 0.032510068164586174\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.30120481927710846,\n\
\ \"acc_stderr\": 0.0357160923005348,\n \"acc_norm\": 0.30120481927710846,\n\
\ \"acc_norm_stderr\": 0.0357160923005348\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.036155076303109344,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.036155076303109344\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2215422276621787,\n\
\ \"mc1_stderr\": 0.014537867601301137,\n \"mc2\": 0.3592861283297082,\n\
\ \"mc2_stderr\": 0.013480719092296862\n }\n}\n```"
repo_url: https://huggingface.co/Writer/palmyra-large
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|arc:challenge|25_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hellaswag|10_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T00:23:42.233683.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T00:23:42.233683.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T00:23:42.233683.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T00:23:42.233683.parquet'
- config_name: results
data_files:
- split: 2023_08_29T00_23_42.233683
path:
- results_2023-08-29T00:23:42.233683.parquet
- split: latest
path:
- results_2023-08-29T00:23:42.233683.parquet
---
# Dataset Card for Evaluation run of Writer/palmyra-large
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Writer/palmyra-large
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Writer/palmyra-large](https://huggingface.co/Writer/palmyra-large) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
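As a small illustrative sketch (not part of the leaderboard tooling): because the timestamped split names are zero-padded, lexicographic order matches chronological order, so the most recent run among the splits of a configuration can be resolved with a plain `max()`:

```python
# Illustrative only: pick the most recent timestamped split name.
# The "latest" alias is skipped; the remaining names are zero-padded
# timestamps, so lexicographic comparison equals chronological order.
def latest_run(split_names):
    timestamped = [s for s in split_names if s != "latest"]
    return max(timestamped)

splits = ["2023_08_28T10_00_00.000000", "2023_08_29T00_23_42.233683", "latest"]
print(latest_run(splits))  # → 2023_08_29T00_23_42.233683
```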
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Writer__palmyra-large",
"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-08-29T00:23:42.233683](https://huggingface.co/datasets/open-llm-leaderboard/details_Writer__palmyra-large/blob/main/results_2023-08-29T00%3A23%3A42.233683.json):
```json
{
"all": {
"acc": 0.29181612856230493,
"acc_stderr": 0.03281505101765732,
"acc_norm": 0.295503113290656,
"acc_norm_stderr": 0.03280873704113449,
"mc1": 0.2215422276621787,
"mc1_stderr": 0.014537867601301137,
"mc2": 0.3592861283297082,
"mc2_stderr": 0.013480719092296862
},
"harness|arc:challenge|25": {
"acc": 0.4189419795221843,
"acc_stderr": 0.014418106953639015,
"acc_norm": 0.4496587030716723,
"acc_norm_stderr": 0.014537144444284736
},
"harness|hellaswag|10": {
"acc": 0.5316669986058554,
"acc_stderr": 0.004979763862134998,
"acc_norm": 0.7184823740290779,
"acc_norm_stderr": 0.004488201756642573
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.03785714465066653,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.03785714465066653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.29605263157894735,
"acc_stderr": 0.03715062154998905,
"acc_norm": 0.29605263157894735,
"acc_norm_stderr": 0.03715062154998905
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.26037735849056604,
"acc_stderr": 0.027008766090708094,
"acc_norm": 0.26037735849056604,
"acc_norm_stderr": 0.027008766090708094
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3125,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.17,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.17,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3179190751445087,
"acc_stderr": 0.03550683989165582,
"acc_norm": 0.3179190751445087,
"acc_norm_stderr": 0.03550683989165582
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.03708284662416544,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.03708284662416544
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2425531914893617,
"acc_stderr": 0.028020226271200217,
"acc_norm": 0.2425531914893617,
"acc_norm_stderr": 0.028020226271200217
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2896551724137931,
"acc_stderr": 0.03780019230438015,
"acc_norm": 0.2896551724137931,
"acc_norm_stderr": 0.03780019230438015
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02256989707491841,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02256989707491841
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.03932537680392871,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.03932537680392871
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2645161290322581,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.2645161290322581,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.19704433497536947,
"acc_stderr": 0.027986724666736226,
"acc_norm": 0.19704433497536947,
"acc_norm_stderr": 0.027986724666736226
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206824,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206824
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3151515151515151,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.3151515151515151,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2878787878787879,
"acc_stderr": 0.03225883512300992,
"acc_norm": 0.2878787878787879,
"acc_norm_stderr": 0.03225883512300992
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.32642487046632124,
"acc_stderr": 0.033840286211432945,
"acc_norm": 0.32642487046632124,
"acc_norm_stderr": 0.033840286211432945
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24358974358974358,
"acc_stderr": 0.021763733684173923,
"acc_norm": 0.24358974358974358,
"acc_norm_stderr": 0.021763733684173923
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.02606715922227579,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.02606715922227579
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.25630252100840334,
"acc_stderr": 0.02835962087053395,
"acc_norm": 0.25630252100840334,
"acc_norm_stderr": 0.02835962087053395
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389024,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.28256880733944956,
"acc_stderr": 0.01930424349770715,
"acc_norm": 0.28256880733944956,
"acc_norm_stderr": 0.01930424349770715
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.028765111718046982,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.028765111718046982
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.31862745098039214,
"acc_stderr": 0.0327028718148208,
"acc_norm": 0.31862745098039214,
"acc_norm_stderr": 0.0327028718148208
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.33755274261603374,
"acc_stderr": 0.030781549102026216,
"acc_norm": 0.33755274261603374,
"acc_norm_stderr": 0.030781549102026216
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3811659192825112,
"acc_stderr": 0.03259625118416828,
"acc_norm": 0.3811659192825112,
"acc_norm_stderr": 0.03259625118416828
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3435114503816794,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.3435114503816794,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3884297520661157,
"acc_stderr": 0.04449270350068382,
"acc_norm": 0.3884297520661157,
"acc_norm_stderr": 0.04449270350068382
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3128834355828221,
"acc_stderr": 0.036429145782924055,
"acc_norm": 0.3128834355828221,
"acc_norm_stderr": 0.036429145782924055
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.32051282051282054,
"acc_stderr": 0.030572811310299604,
"acc_norm": 0.32051282051282054,
"acc_norm_stderr": 0.030572811310299604
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.33205619412515963,
"acc_stderr": 0.016841174655295728,
"acc_norm": 0.33205619412515963,
"acc_norm_stderr": 0.016841174655295728
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2745664739884393,
"acc_stderr": 0.024027745155265012,
"acc_norm": 0.2745664739884393,
"acc_norm_stderr": 0.024027745155265012
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3006535947712418,
"acc_stderr": 0.02625605383571896,
"acc_norm": 0.3006535947712418,
"acc_norm_stderr": 0.02625605383571896
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3183279742765273,
"acc_stderr": 0.026457225067811032,
"acc_norm": 0.3183279742765273,
"acc_norm_stderr": 0.026457225067811032
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.025842248700902164,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.025842248700902164
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.26595744680851063,
"acc_stderr": 0.026358065698880585,
"acc_norm": 0.26595744680851063,
"acc_norm_stderr": 0.026358065698880585
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.29465449804432853,
"acc_stderr": 0.01164357676406955,
"acc_norm": 0.29465449804432853,
"acc_norm_stderr": 0.01164357676406955
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.024562204314142314,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.024562204314142314
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2908496732026144,
"acc_stderr": 0.018373116915903966,
"acc_norm": 0.2908496732026144,
"acc_norm_stderr": 0.018373116915903966
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.24545454545454545,
"acc_stderr": 0.04122066502878284,
"acc_norm": 0.24545454545454545,
"acc_norm_stderr": 0.04122066502878284
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.27346938775510204,
"acc_stderr": 0.028535560337128438,
"acc_norm": 0.27346938775510204,
"acc_norm_stderr": 0.028535560337128438
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.3034825870646766,
"acc_stderr": 0.032510068164586174,
"acc_norm": 0.3034825870646766,
"acc_norm_stderr": 0.032510068164586174
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-virology|5": {
"acc": 0.30120481927710846,
"acc_stderr": 0.0357160923005348,
"acc_norm": 0.30120481927710846,
"acc_norm_stderr": 0.0357160923005348
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.036155076303109344,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.036155076303109344
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2215422276621787,
"mc1_stderr": 0.014537867601301137,
"mc2": 0.3592861283297082,
"mc2_stderr": 0.013480719092296862
}
}
```
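The per-task standard errors in the results above are consistent with the sample standard error sqrt(p * (1 - p) / (n - 1)) computed by the evaluation harness. A minimal sketch recomputing one of them, assuming the `us_foreign_policy` subset contains 100 questions (the subset size is an assumption, not stated in this card):

```python
import math

def sample_stderr(acc: float, n: int) -> float:
    """Sample standard error of an accuracy measured over n questions."""
    return math.sqrt(acc * (1.0 - acc) / (n - 1))

# us_foreign_policy above reports acc = 0.39 with acc_stderr = 0.04902071300001975.
# n = 100 is an assumed subset size used for illustration.
stderr = sample_stderr(0.39, 100)
print(round(stderr, 9))  # 0.049020713, matching the reported acc_stderr
```

The formula uses n - 1 (the sample variance correction) rather than n, which is what makes the recomputed value line up with the reported one to many decimal places.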
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ElMerOs/Prueba | 2023-08-29T00:43:25.000Z | [
"license:openrail",
"region:us"
] | ElMerOs | null | null | null | 0 | 0 | ---
license: openrail
---
|
open-llm-leaderboard/details_radm__Philosophy-Platypus2-13b | 2023-08-29T00:47:23.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of radm/Philosophy-Platypus2-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [radm/Philosophy-Platypus2-13b](https://huggingface.co/radm/Philosophy-Platypus2-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_radm__Philosophy-Platypus2-13b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T00:45:24.163346](https://huggingface.co/datasets/open-llm-leaderboard/details_radm__Philosophy-Platypus2-13b/blob/main/results_2023-08-29T00%3A45%3A24.163346.json):\n\
\n```python\n{\n \"all\": {\n \"acc\": 0.5437981691869808,\n \"\
acc_stderr\": 0.03484311795554624,\n \"acc_norm\": 0.547878610439407,\n \
\ \"acc_norm_stderr\": 0.034826606717822575,\n \"mc1\": 0.24357405140758873,\n\
\ \"mc1_stderr\": 0.015026354824910782,\n \"mc2\": 0.37335488461829447,\n\
\ \"mc2_stderr\": 0.014112790281285795\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5477815699658704,\n \"acc_stderr\": 0.014544519880633822,\n\
\ \"acc_norm\": 0.5861774744027304,\n \"acc_norm_stderr\": 0.014392730009221004\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5828520215096594,\n\
\ \"acc_stderr\": 0.004920800313232742,\n \"acc_norm\": 0.785202150965943,\n\
\ \"acc_norm_stderr\": 0.0040984271589492634\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.4222222222222222,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.039889037033362836,\n\
\ \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.039889037033362836\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5924528301886792,\n \"acc_stderr\": 0.030242233800854494,\n\
\ \"acc_norm\": 0.5924528301886792,\n \"acc_norm_stderr\": 0.030242233800854494\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04016660030451233,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04016660030451233\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.47398843930635837,\n\
\ \"acc_stderr\": 0.03807301726504511,\n \"acc_norm\": 0.47398843930635837,\n\
\ \"acc_norm_stderr\": 0.03807301726504511\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4340425531914894,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.4340425531914894,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.042663394431593935,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.042663394431593935\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37566137566137564,\n \"acc_stderr\": 0.024942368931159795,\n \"\
acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.024942368931159795\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6645161290322581,\n\
\ \"acc_stderr\": 0.026860206444724352,\n \"acc_norm\": 0.6645161290322581,\n\
\ \"acc_norm_stderr\": 0.026860206444724352\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.03465304488406796,\n\
\ \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.03465304488406796\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.03646204963253812,\n\
\ \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.03646204963253812\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7323232323232324,\n \"acc_stderr\": 0.03154449888270286,\n \"\
acc_norm\": 0.7323232323232324,\n \"acc_norm_stderr\": 0.03154449888270286\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.02897908979429673,\n\
\ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.02897908979429673\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.02483881198803316,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.02483881198803316\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\
: {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n\
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5630252100840336,\n \"acc_stderr\": 0.032219436365661956,\n\
\ \"acc_norm\": 0.5630252100840336,\n \"acc_norm_stderr\": 0.032219436365661956\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7688073394495413,\n \"acc_stderr\": 0.018075750241633146,\n \"\
acc_norm\": 0.7688073394495413,\n \"acc_norm_stderr\": 0.018075750241633146\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4351851851851852,\n \"acc_stderr\": 0.03381200005643525,\n \"\
acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.03381200005643525\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7215189873417721,\n \"acc_stderr\": 0.02917868230484253,\n\
\ \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.02917868230484253\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n\
\ \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n\
\ \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n\
\ \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6611570247933884,\n \"acc_stderr\": 0.04320767807536671,\n \"\
acc_norm\": 0.6611570247933884,\n \"acc_norm_stderr\": 0.04320767807536671\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5950920245398773,\n \"acc_stderr\": 0.03856672163548913,\n\
\ \"acc_norm\": 0.5950920245398773,\n \"acc_norm_stderr\": 0.03856672163548913\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291518,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291518\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.04541609446503947,\n\
\ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.04541609446503947\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7008547008547008,\n\
\ \"acc_stderr\": 0.02999695185834949,\n \"acc_norm\": 0.7008547008547008,\n\
\ \"acc_norm_stderr\": 0.02999695185834949\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7420178799489144,\n\
\ \"acc_stderr\": 0.01564583018834895,\n \"acc_norm\": 0.7420178799489144,\n\
\ \"acc_norm_stderr\": 0.01564583018834895\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5982658959537572,\n \"acc_stderr\": 0.026394104177643637,\n\
\ \"acc_norm\": 0.5982658959537572,\n \"acc_norm_stderr\": 0.026394104177643637\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3463687150837989,\n\
\ \"acc_stderr\": 0.015913546784020117,\n \"acc_norm\": 0.3463687150837989,\n\
\ \"acc_norm_stderr\": 0.015913546784020117\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5816993464052288,\n \"acc_stderr\": 0.028245134024387303,\n\
\ \"acc_norm\": 0.5816993464052288,\n \"acc_norm_stderr\": 0.028245134024387303\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6366559485530546,\n\
\ \"acc_stderr\": 0.027316847674192707,\n \"acc_norm\": 0.6366559485530546,\n\
\ \"acc_norm_stderr\": 0.027316847674192707\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.02672586880910079,\n\
\ \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.02672586880910079\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41843971631205673,\n \"acc_stderr\": 0.029427994039419994,\n \
\ \"acc_norm\": 0.41843971631205673,\n \"acc_norm_stderr\": 0.029427994039419994\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.40352020860495436,\n\
\ \"acc_stderr\": 0.012530241301193186,\n \"acc_norm\": 0.40352020860495436,\n\
\ \"acc_norm_stderr\": 0.012530241301193186\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5441176470588235,\n \"acc_stderr\": 0.030254372573976715,\n\
\ \"acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.030254372573976715\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5343137254901961,\n \"acc_stderr\": 0.020180144843307293,\n \
\ \"acc_norm\": 0.5343137254901961,\n \"acc_norm_stderr\": 0.020180144843307293\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5959183673469388,\n \"acc_stderr\": 0.03141470802586589,\n\
\ \"acc_norm\": 0.5959183673469388,\n \"acc_norm_stderr\": 0.03141470802586589\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n\
\ \"acc_stderr\": 0.031871875379197966,\n \"acc_norm\": 0.7164179104477612,\n\
\ \"acc_norm_stderr\": 0.031871875379197966\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.041633319989322626\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24357405140758873,\n\
\ \"mc1_stderr\": 0.015026354824910782,\n \"mc2\": 0.37335488461829447,\n\
\ \"mc2_stderr\": 0.014112790281285795\n }\n}\n```"
repo_url: https://huggingface.co/radm/Philosophy-Platypus2-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|arc:challenge|25_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hellaswag|10_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T00:45:24.163346.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T00:45:24.163346.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T00:45:24.163346.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T00:45:24.163346.parquet'
- config_name: results
data_files:
- split: 2023_08_29T00_45_24.163346
path:
- results_2023-08-29T00:45:24.163346.parquet
- split: latest
path:
- results_2023-08-29T00:45:24.163346.parquet
---
# Dataset Card for Evaluation run of radm/Philosophy-Platypus2-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/radm/Philosophy-Platypus2-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [radm/Philosophy-Platypus2-13b](https://huggingface.co/radm/Philosophy-Platypus2-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_radm__Philosophy-Platypus2-13b",
"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-08-29T00:45:24.163346](https://huggingface.co/datasets/open-llm-leaderboard/details_radm__Philosophy-Platypus2-13b/blob/main/results_2023-08-29T00%3A45%3A24.163346.json):
```python
{
"all": {
"acc": 0.5437981691869808,
"acc_stderr": 0.03484311795554624,
"acc_norm": 0.547878610439407,
"acc_norm_stderr": 0.034826606717822575,
"mc1": 0.24357405140758873,
"mc1_stderr": 0.015026354824910782,
"mc2": 0.37335488461829447,
"mc2_stderr": 0.014112790281285795
},
"harness|arc:challenge|25": {
"acc": 0.5477815699658704,
"acc_stderr": 0.014544519880633822,
"acc_norm": 0.5861774744027304,
"acc_norm_stderr": 0.014392730009221004
},
"harness|hellaswag|10": {
"acc": 0.5828520215096594,
"acc_stderr": 0.004920800313232742,
"acc_norm": 0.785202150965943,
"acc_norm_stderr": 0.0040984271589492634
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4222222222222222,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.4222222222222222,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5986842105263158,
"acc_stderr": 0.039889037033362836,
"acc_norm": 0.5986842105263158,
"acc_norm_stderr": 0.039889037033362836
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5924528301886792,
"acc_stderr": 0.030242233800854494,
"acc_norm": 0.5924528301886792,
"acc_norm_stderr": 0.030242233800854494
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04016660030451233,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04016660030451233
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.47398843930635837,
"acc_stderr": 0.03807301726504511,
"acc_norm": 0.47398843930635837,
"acc_norm_stderr": 0.03807301726504511
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105653,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105653
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4340425531914894,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.4340425531914894,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.042663394431593935,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.042663394431593935
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.024942368931159795,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.024942368931159795
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6645161290322581,
"acc_stderr": 0.026860206444724352,
"acc_norm": 0.6645161290322581,
"acc_norm_stderr": 0.026860206444724352
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.03465304488406796,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.03465304488406796
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6787878787878788,
"acc_stderr": 0.03646204963253812,
"acc_norm": 0.6787878787878788,
"acc_norm_stderr": 0.03646204963253812
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7323232323232324,
"acc_stderr": 0.03154449888270286,
"acc_norm": 0.7323232323232324,
"acc_norm_stderr": 0.03154449888270286
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.02897908979429673,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.02897908979429673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6,
"acc_stderr": 0.02483881198803316,
"acc_norm": 0.6,
"acc_norm_stderr": 0.02483881198803316
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5630252100840336,
"acc_stderr": 0.032219436365661956,
"acc_norm": 0.5630252100840336,
"acc_norm_stderr": 0.032219436365661956
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7688073394495413,
"acc_stderr": 0.018075750241633146,
"acc_norm": 0.7688073394495413,
"acc_norm_stderr": 0.018075750241633146
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.03381200005643525,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.03381200005643525
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7215189873417721,
"acc_stderr": 0.02917868230484253,
"acc_norm": 0.7215189873417721,
"acc_norm_stderr": 0.02917868230484253
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6053811659192825,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.6053811659192825,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5954198473282443,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.5954198473282443,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6611570247933884,
"acc_stderr": 0.04320767807536671,
"acc_norm": 0.6611570247933884,
"acc_norm_stderr": 0.04320767807536671
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04643454608906275,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04643454608906275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5950920245398773,
"acc_stderr": 0.03856672163548913,
"acc_norm": 0.5950920245398773,
"acc_norm_stderr": 0.03856672163548913
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291518,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291518
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.04541609446503947,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.04541609446503947
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7008547008547008,
"acc_stderr": 0.02999695185834949,
"acc_norm": 0.7008547008547008,
"acc_norm_stderr": 0.02999695185834949
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7420178799489144,
"acc_stderr": 0.01564583018834895,
"acc_norm": 0.7420178799489144,
"acc_norm_stderr": 0.01564583018834895
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5982658959537572,
"acc_stderr": 0.026394104177643637,
"acc_norm": 0.5982658959537572,
"acc_norm_stderr": 0.026394104177643637
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3463687150837989,
"acc_stderr": 0.015913546784020117,
"acc_norm": 0.3463687150837989,
"acc_norm_stderr": 0.015913546784020117
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5816993464052288,
"acc_stderr": 0.028245134024387303,
"acc_norm": 0.5816993464052288,
"acc_norm_stderr": 0.028245134024387303
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6366559485530546,
"acc_stderr": 0.027316847674192707,
"acc_norm": 0.6366559485530546,
"acc_norm_stderr": 0.027316847674192707
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.02672586880910079,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.02672586880910079
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41843971631205673,
"acc_stderr": 0.029427994039419994,
"acc_norm": 0.41843971631205673,
"acc_norm_stderr": 0.029427994039419994
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.40352020860495436,
"acc_stderr": 0.012530241301193186,
"acc_norm": 0.40352020860495436,
"acc_norm_stderr": 0.012530241301193186
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5441176470588235,
"acc_stderr": 0.030254372573976715,
"acc_norm": 0.5441176470588235,
"acc_norm_stderr": 0.030254372573976715
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5343137254901961,
"acc_stderr": 0.020180144843307293,
"acc_norm": 0.5343137254901961,
"acc_norm_stderr": 0.020180144843307293
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5959183673469388,
"acc_stderr": 0.03141470802586589,
"acc_norm": 0.5959183673469388,
"acc_norm_stderr": 0.03141470802586589
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7164179104477612,
"acc_stderr": 0.031871875379197966,
"acc_norm": 0.7164179104477612,
"acc_norm_stderr": 0.031871875379197966
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322626,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322626
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24357405140758873,
"mc1_stderr": 0.015026354824910782,
"mc2": 0.37335488461829447,
"mc2_stderr": 0.014112790281285795
}
}
```
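As a quick illustration, the per-task accuracies in a results dictionary like the one above can be collected and averaged with a few lines of standard Python. This is only a sketch: the `results` variable below is a small hand-copied excerpt of the dictionary shown above, not the full set of tasks.

```python
# A small excerpt of the results dictionary shown above (assumed shape:
# task name -> metrics dict). In practice this would be the full parsed dict.
results = {
    "all": {"acc": 0.5437981691869808},
    "harness|arc:challenge|25": {"acc": 0.5477815699658704},
    "harness|hellaswag|10": {"acc": 0.5828520215096594},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34},
}

# Collect the accuracy of every task entry, skipping the "all" aggregate.
task_accs = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
}

# Unweighted mean accuracy over the selected tasks.
mean_acc = sum(task_accs.values()) / len(task_accs)
print(f"{len(task_accs)} tasks, mean acc = {mean_acc:.4f}")
```

The same pattern works for `acc_norm` or the TruthfulQA `mc1`/`mc2` keys by changing the metric name in the comprehension.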
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ajoshi-6/cleaned_disaster | 2023-08-29T01:02:11.000Z | [
"region:us"
] | ajoshi-6 | null | null | null | 0 | 0 | Entry not found |
thomaslu/articulationGAN_finetuning_data | 2023-08-29T01:12:13.000Z | [
"region:us"
] | thomaslu | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 9100963.0
num_examples: 111
- name: train
num_bytes: 32796309.0
num_examples: 400
download_size: 41933143
dataset_size: 41897272.0
---
# Dataset Card for "articulationGAN_finetuning_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
spsither/prepare_dataset_train_batch2 | 2023-08-29T06:23:03.000Z | [
"region:us"
] | spsither | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 95846087800
num_examples: 99760
download_size: 5077027527
dataset_size: 95846087800
---
# Dataset Card for "prepare_dataset_train_batch2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_automl_MagicTelescope_gosdt_l512_d3 | 2023-08-29T01:35:00.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 6767200000
num_examples: 100000
- name: validation
num_bytes: 676720000
num_examples: 10000
download_size: 2607095699
dataset_size: 7443920000
---
# Dataset Card for "autotree_automl_MagicTelescope_gosdt_l512_d3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_automl_default-of-credit-card-clients_gosdt_l512_d3_sd1 | 2023-08-29T01:47:22.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 10863200000
num_examples: 100000
- name: validation
num_bytes: 1086320000
num_examples: 10000
download_size: 2039303021
dataset_size: 11949520000
---
# Dataset Card for "autotree_automl_default-of-credit-card-clients_gosdt_l512_d3_sd1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JackieZhang/chipgpt | 2023-08-30T02:51:48.000Z | [
"task_categories:text-generation",
"task_categories:conversational",
"size_categories:n<1K",
"language:en",
"license:apache-2.0",
"code",
"region:us"
] | JackieZhang | null | null | null | 0 | 0 | ---
license: apache-2.0
task_categories:
- text-generation
- conversational
language:
- en
tags:
- code
size_categories:
- n<1K
--- |
Goodevile/OCR-Grayscale-6Alphanumerical | 2023-08-29T02:13:56.000Z | [
"region:us"
] | Goodevile | null | null | null | 0 | 0 | Entry not found |
ChanceFocus/flare-es-fns | 2023-08-29T04:06:11.000Z | [
"region:us"
] | ChanceFocus | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: query
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
splits:
- name: test
num_bytes: 20134903
num_examples: 50
download_size: 9992059
dataset_size: 20134903
---
# Dataset Card for "flare-es-fns"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ChanceFocus/flare-es-multifin | 2023-08-29T02:31:46.000Z | [
"region:us"
] | ChanceFocus | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: 'query:'
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
- name: choices
sequence: string
- name: gold
dtype: int64
- name: query
dtype: string
splits:
- name: test
num_bytes: 170917
num_examples: 230
download_size: 39876
dataset_size: 170917
---
# Dataset Card for "flare-es-multifin"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_chargoddard__llama-2-34b-uncode | 2023-08-29T02:24:21.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of chargoddard/llama-2-34b-uncode
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [chargoddard/llama-2-34b-uncode](https://huggingface.co/chargoddard/llama-2-34b-uncode)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_chargoddard__llama-2-34b-uncode\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-29T02:22:47.016201](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__llama-2-34b-uncode/blob/main/results_2023-08-29T02%3A22%3A47.016201.json):\n\
\n```python\n{\n \"all\": {\n \"acc\": 0.3830479829443808,\n \"\
acc_stderr\": 0.03466690310283795,\n \"acc_norm\": 0.3842963264119114,\n\
\ \"acc_norm_stderr\": 0.03467326802571514,\n \"mc1\": 0.2484700122399021,\n\
\ \"mc1_stderr\": 0.015127427096520688,\n \"mc2\": 0.4094051732845386,\n\
\ \"mc2_stderr\": 0.014058890306038239\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.36860068259385664,\n \"acc_stderr\": 0.014097810678042187,\n\
\ \"acc_norm\": 0.39505119453924914,\n \"acc_norm_stderr\": 0.014285898292938167\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.29177454690300736,\n\
\ \"acc_stderr\": 0.004536500714147978,\n \"acc_norm\": 0.33897629954192393,\n\
\ \"acc_norm_stderr\": 0.00472394354900599\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4148148148148148,\n\
\ \"acc_stderr\": 0.042561937679014075,\n \"acc_norm\": 0.4148148148148148,\n\
\ \"acc_norm_stderr\": 0.042561937679014075\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.27631578947368424,\n \"acc_stderr\": 0.03639057569952925,\n\
\ \"acc_norm\": 0.27631578947368424,\n \"acc_norm_stderr\": 0.03639057569952925\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.39,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4188679245283019,\n \"acc_stderr\": 0.030365050829115205,\n\
\ \"acc_norm\": 0.4188679245283019,\n \"acc_norm_stderr\": 0.030365050829115205\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3819444444444444,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.3819444444444444,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.42196531791907516,\n\
\ \"acc_stderr\": 0.03765746693865151,\n \"acc_norm\": 0.42196531791907516,\n\
\ \"acc_norm_stderr\": 0.03765746693865151\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3872340425531915,\n \"acc_stderr\": 0.03184389265339525,\n\
\ \"acc_norm\": 0.3872340425531915,\n \"acc_norm_stderr\": 0.03184389265339525\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.03999423879281335,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.03999423879281335\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.30344827586206896,\n \"acc_stderr\": 0.038312260488503336,\n\
\ \"acc_norm\": 0.30344827586206896,\n \"acc_norm_stderr\": 0.038312260488503336\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2830687830687831,\n \"acc_stderr\": 0.023201392938194974,\n \"\
acc_norm\": 0.2830687830687831,\n \"acc_norm_stderr\": 0.023201392938194974\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403325,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403325\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.432258064516129,\n\
\ \"acc_stderr\": 0.028181739720019406,\n \"acc_norm\": 0.432258064516129,\n\
\ \"acc_norm_stderr\": 0.028181739720019406\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3793103448275862,\n \"acc_stderr\": 0.03413963805906235,\n\
\ \"acc_norm\": 0.3793103448275862,\n \"acc_norm_stderr\": 0.03413963805906235\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5151515151515151,\n \"acc_stderr\": 0.03560716516531061,\n \"\
acc_norm\": 0.5151515151515151,\n \"acc_norm_stderr\": 0.03560716516531061\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5181347150259067,\n \"acc_stderr\": 0.036060650018329185,\n\
\ \"acc_norm\": 0.5181347150259067,\n \"acc_norm_stderr\": 0.036060650018329185\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.382051282051282,\n \"acc_stderr\": 0.02463554916390823,\n \
\ \"acc_norm\": 0.382051282051282,\n \"acc_norm_stderr\": 0.02463554916390823\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275794,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275794\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.40756302521008403,\n \"acc_stderr\": 0.03191863374478466,\n\
\ \"acc_norm\": 0.40756302521008403,\n \"acc_norm_stderr\": 0.03191863374478466\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.44036697247706424,\n \"acc_stderr\": 0.021284310623761543,\n \"\
acc_norm\": 0.44036697247706424,\n \"acc_norm_stderr\": 0.021284310623761543\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.24537037037037038,\n \"acc_stderr\": 0.02934666509437294,\n \"\
acc_norm\": 0.24537037037037038,\n \"acc_norm_stderr\": 0.02934666509437294\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.29901960784313725,\n \"acc_stderr\": 0.03213325717373617,\n \"\
acc_norm\": 0.29901960784313725,\n \"acc_norm_stderr\": 0.03213325717373617\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.37130801687763715,\n \"acc_stderr\": 0.03145068600744858,\n \
\ \"acc_norm\": 0.37130801687763715,\n \"acc_norm_stderr\": 0.03145068600744858\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.36771300448430494,\n\
\ \"acc_stderr\": 0.03236198350928276,\n \"acc_norm\": 0.36771300448430494,\n\
\ \"acc_norm_stderr\": 0.03236198350928276\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.40458015267175573,\n \"acc_stderr\": 0.043046937953806645,\n\
\ \"acc_norm\": 0.40458015267175573,\n \"acc_norm_stderr\": 0.043046937953806645\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5619834710743802,\n \"acc_stderr\": 0.04529146804435792,\n \"\
acc_norm\": 0.5619834710743802,\n \"acc_norm_stderr\": 0.04529146804435792\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04833682445228318,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04833682445228318\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.39263803680981596,\n \"acc_stderr\": 0.03836740907831029,\n\
\ \"acc_norm\": 0.39263803680981596,\n \"acc_norm_stderr\": 0.03836740907831029\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.044328040552915185,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.044328040552915185\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5728155339805825,\n \"acc_stderr\": 0.04897957737781168,\n\
\ \"acc_norm\": 0.5728155339805825,\n \"acc_norm_stderr\": 0.04897957737781168\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6025641025641025,\n\
\ \"acc_stderr\": 0.03205953453789293,\n \"acc_norm\": 0.6025641025641025,\n\
\ \"acc_norm_stderr\": 0.03205953453789293\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5491698595146871,\n\
\ \"acc_stderr\": 0.017793297572699037,\n \"acc_norm\": 0.5491698595146871,\n\
\ \"acc_norm_stderr\": 0.017793297572699037\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3959537572254335,\n \"acc_stderr\": 0.026329813341946243,\n\
\ \"acc_norm\": 0.3959537572254335,\n \"acc_norm_stderr\": 0.026329813341946243\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24022346368715083,\n\
\ \"acc_stderr\": 0.014288343803925312,\n \"acc_norm\": 0.24022346368715083,\n\
\ \"acc_norm_stderr\": 0.014288343803925312\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.027956046165424502,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.027956046165424502\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4887459807073955,\n\
\ \"acc_stderr\": 0.028390897396863526,\n \"acc_norm\": 0.4887459807073955,\n\
\ \"acc_norm_stderr\": 0.028390897396863526\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4228395061728395,\n \"acc_stderr\": 0.027487472980871598,\n\
\ \"acc_norm\": 0.4228395061728395,\n \"acc_norm_stderr\": 0.027487472980871598\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25886524822695034,\n \"acc_stderr\": 0.02612957252718085,\n \
\ \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.02612957252718085\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27640156453715775,\n\
\ \"acc_stderr\": 0.011422153194553582,\n \"acc_norm\": 0.27640156453715775,\n\
\ \"acc_norm_stderr\": 0.011422153194553582\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3713235294117647,\n \"acc_stderr\": 0.02934980313976587,\n\
\ \"acc_norm\": 0.3713235294117647,\n \"acc_norm_stderr\": 0.02934980313976587\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.32679738562091504,\n \"acc_stderr\": 0.01897542792050721,\n \
\ \"acc_norm\": 0.32679738562091504,\n \"acc_norm_stderr\": 0.01897542792050721\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.44545454545454544,\n\
\ \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.44545454545454544,\n\
\ \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.27346938775510204,\n \"acc_stderr\": 0.02853556033712845,\n\
\ \"acc_norm\": 0.27346938775510204,\n \"acc_norm_stderr\": 0.02853556033712845\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.42786069651741293,\n\
\ \"acc_stderr\": 0.03498541988407795,\n \"acc_norm\": 0.42786069651741293,\n\
\ \"acc_norm_stderr\": 0.03498541988407795\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3674698795180723,\n\
\ \"acc_stderr\": 0.03753267402120574,\n \"acc_norm\": 0.3674698795180723,\n\
\ \"acc_norm_stderr\": 0.03753267402120574\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6432748538011696,\n \"acc_stderr\": 0.03674013002860954,\n\
\ \"acc_norm\": 0.6432748538011696,\n \"acc_norm_stderr\": 0.03674013002860954\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2484700122399021,\n\
\ \"mc1_stderr\": 0.015127427096520688,\n \"mc2\": 0.4094051732845386,\n\
\ \"mc2_stderr\": 0.014058890306038239\n }\n}\n```"
repo_url: https://huggingface.co/chargoddard/llama-2-34b-uncode
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|arc:challenge|25_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hellaswag|10_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T02:22:47.016201.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T02:22:47.016201.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T02:22:47.016201.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T02:22:47.016201.parquet'
- config_name: results
data_files:
- split: 2023_08_29T02_22_47.016201
path:
- results_2023-08-29T02:22:47.016201.parquet
- split: latest
path:
- results_2023-08-29T02:22:47.016201.parquet
---
# Dataset Card for Evaluation run of chargoddard/llama-2-34b-uncode
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/chargoddard/llama-2-34b-uncode
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [chargoddard/llama-2-34b-uncode](https://huggingface.co/chargoddard/llama-2-34b-uncode) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_chargoddard__llama-2-34b-uncode",
"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-08-29T02:22:47.016201](https://huggingface.co/datasets/open-llm-leaderboard/details_chargoddard__llama-2-34b-uncode/blob/main/results_2023-08-29T02%3A22%3A47.016201.json):
```json
{
"all": {
"acc": 0.3830479829443808,
"acc_stderr": 0.03466690310283795,
"acc_norm": 0.3842963264119114,
"acc_norm_stderr": 0.03467326802571514,
"mc1": 0.2484700122399021,
"mc1_stderr": 0.015127427096520688,
"mc2": 0.4094051732845386,
"mc2_stderr": 0.014058890306038239
},
"harness|arc:challenge|25": {
"acc": 0.36860068259385664,
"acc_stderr": 0.014097810678042187,
"acc_norm": 0.39505119453924914,
"acc_norm_stderr": 0.014285898292938167
},
"harness|hellaswag|10": {
"acc": 0.29177454690300736,
"acc_stderr": 0.004536500714147978,
"acc_norm": 0.33897629954192393,
"acc_norm_stderr": 0.00472394354900599
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.042561937679014075,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.042561937679014075
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.27631578947368424,
"acc_stderr": 0.03639057569952925,
"acc_norm": 0.27631578947368424,
"acc_norm_stderr": 0.03639057569952925
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4188679245283019,
"acc_stderr": 0.030365050829115205,
"acc_norm": 0.4188679245283019,
"acc_norm_stderr": 0.030365050829115205
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3819444444444444,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.3819444444444444,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.42196531791907516,
"acc_stderr": 0.03765746693865151,
"acc_norm": 0.42196531791907516,
"acc_norm_stderr": 0.03765746693865151
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3872340425531915,
"acc_stderr": 0.03184389265339525,
"acc_norm": 0.3872340425531915,
"acc_norm_stderr": 0.03184389265339525
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281335,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281335
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.30344827586206896,
"acc_stderr": 0.038312260488503336,
"acc_norm": 0.30344827586206896,
"acc_norm_stderr": 0.038312260488503336
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2830687830687831,
"acc_stderr": 0.023201392938194974,
"acc_norm": 0.2830687830687831,
"acc_norm_stderr": 0.023201392938194974
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403325,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403325
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.432258064516129,
"acc_stderr": 0.028181739720019406,
"acc_norm": 0.432258064516129,
"acc_norm_stderr": 0.028181739720019406
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3793103448275862,
"acc_stderr": 0.03413963805906235,
"acc_norm": 0.3793103448275862,
"acc_norm_stderr": 0.03413963805906235
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.22424242424242424,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.22424242424242424,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5151515151515151,
"acc_stderr": 0.03560716516531061,
"acc_norm": 0.5151515151515151,
"acc_norm_stderr": 0.03560716516531061
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5181347150259067,
"acc_stderr": 0.036060650018329185,
"acc_norm": 0.5181347150259067,
"acc_norm_stderr": 0.036060650018329185
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.382051282051282,
"acc_stderr": 0.02463554916390823,
"acc_norm": 0.382051282051282,
"acc_norm_stderr": 0.02463554916390823
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275794,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275794
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.40756302521008403,
"acc_stderr": 0.03191863374478466,
"acc_norm": 0.40756302521008403,
"acc_norm_stderr": 0.03191863374478466
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.44036697247706424,
"acc_stderr": 0.021284310623761543,
"acc_norm": 0.44036697247706424,
"acc_norm_stderr": 0.021284310623761543
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.24537037037037038,
"acc_stderr": 0.02934666509437294,
"acc_norm": 0.24537037037037038,
"acc_norm_stderr": 0.02934666509437294
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.29901960784313725,
"acc_stderr": 0.03213325717373617,
"acc_norm": 0.29901960784313725,
"acc_norm_stderr": 0.03213325717373617
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.37130801687763715,
"acc_stderr": 0.03145068600744858,
"acc_norm": 0.37130801687763715,
"acc_norm_stderr": 0.03145068600744858
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.36771300448430494,
"acc_stderr": 0.03236198350928276,
"acc_norm": 0.36771300448430494,
"acc_norm_stderr": 0.03236198350928276
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.40458015267175573,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.40458015267175573,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5619834710743802,
"acc_stderr": 0.04529146804435792,
"acc_norm": 0.5619834710743802,
"acc_norm_stderr": 0.04529146804435792
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5,
"acc_stderr": 0.04833682445228318,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04833682445228318
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.39263803680981596,
"acc_stderr": 0.03836740907831029,
"acc_norm": 0.39263803680981596,
"acc_norm_stderr": 0.03836740907831029
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.044328040552915185,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.044328040552915185
},
"harness|hendrycksTest-management|5": {
"acc": 0.5728155339805825,
"acc_stderr": 0.04897957737781168,
"acc_norm": 0.5728155339805825,
"acc_norm_stderr": 0.04897957737781168
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6025641025641025,
"acc_stderr": 0.03205953453789293,
"acc_norm": 0.6025641025641025,
"acc_norm_stderr": 0.03205953453789293
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5491698595146871,
"acc_stderr": 0.017793297572699037,
"acc_norm": 0.5491698595146871,
"acc_norm_stderr": 0.017793297572699037
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3959537572254335,
"acc_stderr": 0.026329813341946243,
"acc_norm": 0.3959537572254335,
"acc_norm_stderr": 0.026329813341946243
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24022346368715083,
"acc_stderr": 0.014288343803925312,
"acc_norm": 0.24022346368715083,
"acc_norm_stderr": 0.014288343803925312
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.027956046165424502,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.027956046165424502
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4887459807073955,
"acc_stderr": 0.028390897396863526,
"acc_norm": 0.4887459807073955,
"acc_norm_stderr": 0.028390897396863526
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4228395061728395,
"acc_stderr": 0.027487472980871598,
"acc_norm": 0.4228395061728395,
"acc_norm_stderr": 0.027487472980871598
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.02612957252718085,
"acc_norm": 0.25886524822695034,
"acc_norm_stderr": 0.02612957252718085
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.27640156453715775,
"acc_stderr": 0.011422153194553582,
"acc_norm": 0.27640156453715775,
"acc_norm_stderr": 0.011422153194553582
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3713235294117647,
"acc_stderr": 0.02934980313976587,
"acc_norm": 0.3713235294117647,
"acc_norm_stderr": 0.02934980313976587
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.32679738562091504,
"acc_stderr": 0.01897542792050721,
"acc_norm": 0.32679738562091504,
"acc_norm_stderr": 0.01897542792050721
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.44545454545454544,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.44545454545454544,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.27346938775510204,
"acc_stderr": 0.02853556033712845,
"acc_norm": 0.27346938775510204,
"acc_norm_stderr": 0.02853556033712845
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.42786069651741293,
"acc_stderr": 0.03498541988407795,
"acc_norm": 0.42786069651741293,
"acc_norm_stderr": 0.03498541988407795
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3674698795180723,
"acc_stderr": 0.03753267402120574,
"acc_norm": 0.3674698795180723,
"acc_norm_stderr": 0.03753267402120574
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6432748538011696,
"acc_stderr": 0.03674013002860954,
"acc_norm": 0.6432748538011696,
"acc_norm_stderr": 0.03674013002860954
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2484700122399021,
"mc1_stderr": 0.015127427096520688,
"mc2": 0.4094051732845386,
"mc2_stderr": 0.014058890306038239
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Panacea1103/Target_Detection | 2023-09-22T14:42:58.000Z | [
"region:us"
] | Panacea1103 | null | null | null | 0 | 0 | Entry not found |
HarshSinyal/COI_allProvisions2019 | 2023-08-29T10:44:52.000Z | [
"region:us"
] | HarshSinyal | null | null | null | 0 | 0 |
# Constitution of India
Constitution of India in JSON Format
All Articles on the Indian Constitution in the following 3 Formats
1. CSV - Comma-Separated Values | [Constitution of India.csv](https://github.com/civictech-India/constitution-of-india/blob/main/Constitution%20of%20India.csv "Constitution of India.csv")
2. DB - SQLite Database File | [COI.db](https://github.com/civictech-India/constitution-of-india/blob/main/COI.db "COI.db")
3. JSON - JavaScript Object Notation | [constitution_of_india.json](https://github.com/civictech-India/constitution-of-india/blob/main/constitution_of_india.json "constitution_of_india.json")
|
AMead10/climbing-images | 2023-08-29T03:00:44.000Z | [
"region:us"
] | AMead10 | null | null | null | 0 | 0 | Entry not found |
morimorimori/nva-satomi3 | 2023-08-29T02:51:35.000Z | [
"region:us"
] | morimorimori | null | null | null | 0 | 0 | Entry not found |
tyzhu/fwv2_random_num_train_1000_eval_100 | 2023-08-29T05:32:32.000Z | [
"region:us"
] | tyzhu | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: train_doc2id
path: data/train_doc2id-*
- split: train_id2doc
path: data/train_id2doc-*
- split: train_find_word
path: data/train_find_word-*
- split: eval_find_word
path: data/eval_find_word-*
- split: id_context_mapping
path: data/id_context_mapping-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 195871
num_examples: 2100
- name: train_doc2id
num_bytes: 92393
num_examples: 1100
- name: train_id2doc
num_bytes: 95693
num_examples: 1100
- name: train_find_word
num_bytes: 100178
num_examples: 1000
- name: eval_find_word
num_bytes: 10146
num_examples: 100
- name: id_context_mapping
num_bytes: 60493
num_examples: 1100
download_size: 0
dataset_size: 554774
---
# Dataset Card for "fwv2_random_num_train_1000_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ChanceFocus/flare-es-efp | 2023-08-29T03:10:28.000Z | [
"region:us"
] | ChanceFocus | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: 'query:'
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
- name: choices
sequence: string
- name: gold
dtype: int64
- name: query
dtype: string
splits:
- name: test
num_bytes: 66200
num_examples: 37
download_size: 43563
dataset_size: 66200
---
# Dataset Card for "flare-es-efp"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ChanceFocus/flare-es-efpa | 2023-08-29T03:12:27.000Z | [
"region:us"
] | ChanceFocus | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: 'query:'
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
- name: choices
sequence: string
- name: gold
dtype: int64
- name: query
dtype: string
splits:
- name: test
num_bytes: 353055
num_examples: 228
download_size: 141839
dataset_size: 353055
---
# Dataset Card for "flare-es-efpa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zentheory/food-pairing-qa-sample | 2023-08-29T03:29:12.000Z | [
"license:cc-by-nc-4.0",
"region:us"
] | zentheory | null | null | null | 0 | 0 | ---
license: cc-by-nc-4.0
---
|
kimnt93/OpenOrca-50k | 2023-08-29T03:28:36.000Z | [
"region:us"
] | kimnt93 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 85583064
num_examples: 50000
download_size: 49265986
dataset_size: 85583064
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# OpenOrca-50k Dataset
## Description
OpenOrca-50k is a curated subset of the original Open-Orca dataset available on HuggingFace. This subset contains 50,000 random samples from the main dataset. It has been extracted to serve specific research purposes, especially for those requiring a smaller but representative portion of the original dataset.
Each entry in the dataset has the following structure:
- `id`: The unique identifier for the sample.
- `system_prompt`: System-generated prompt or context for the interaction.
- `question`: The main question posed, corresponding to the given prompt.
- `response`: The system's or model's response to the question.
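
For reference, a record in this subset looks roughly like the following. The field values here are illustrative placeholders, not actual dataset entries:

```python
# Illustrative record matching the schema above; the values are made up.
example = {
    "id": "sample-000001",  # unique identifier for the sample
    "system_prompt": "You are a helpful assistant.",
    "question": "What is the capital of France?",
    "response": "The capital of France is Paris.",
}

# Every entry exposes exactly these four string fields.
assert set(example) == {"id", "system_prompt", "question", "response"}
assert all(isinstance(v, str) for v in example.values())
```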
## Source
The original dataset can be found [here](https://huggingface.co/datasets/Open-Orca/OpenOrca).
## Usage
This dataset is primarily tailored for researchers and machine learning practitioners who wish to work with a smaller version of the Open-Orca dataset. It is ideal for quick prototyping or for scenarios with limited computational resources.
To efficiently load the dataset using HuggingFace's datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("kimnt93/OpenOrca-50k")
```
## License
[Open-Orca](https://huggingface.co/datasets/Open-Orca/OpenOrca) |
Discipulo/EvoPenBib | 2023-08-29T03:26:22.000Z | [
"region:us"
] | Discipulo | null | null | null | 0 | 0 | Entry not found |
Seguidor/pensamiento-cristiano | 2023-08-29T03:42:00.000Z | [
"region:us"
] | Seguidor | null | null | null | 0 | 0 | Entry not found |
Roscall/LisaStans | 2023-08-29T03:40:08.000Z | [
"region:us"
] | Roscall | null | null | null | 0 | 0 | Entry not found |
tgj11/mhzs112 | 2023-09-05T08:52:25.000Z | [
"region:us"
] | tgj11 | null | null | null | 0 | 0 | Entry not found |
wisenut-nlp-team/korquad_v1.0_gqa_ab_context | 2023-08-29T03:49:06.000Z | [
"region:us"
] | wisenut-nlp-team | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: answer
dtype: string
- name: question
dtype: string
- name: context_a
dtype: string
- name: context_b
dtype: string
splits:
- name: train
num_bytes: 86477195
num_examples: 46621
- name: validation
num_bytes: 8491857
num_examples: 4405
download_size: 13204891
dataset_size: 94969052
---
# Dataset Card for "korquad_v1.0_gqa_ab_context"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
codingis4noobs2/Faqs | 2023-08-29T03:58:03.000Z | [
"region:us"
] | codingis4noobs2 | null | null | null | 0 | 0 | Entry not found |
xyliu-cs/hkcert_faq | 2023-08-29T04:00:53.000Z | [
"region:us"
] | xyliu-cs | null | null | null | 0 | 0 | Entry not found |
Awtryn/telecharger | 2023-08-29T04:53:11.000Z | [
"region:us"
] | Awtryn | null | null | null | 0 | 0 | Entry not found |
ChanceFocus/flare-es-tsa | 2023-08-29T04:09:20.000Z | [
"region:us"
] | ChanceFocus | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: query
dtype: string
- name: answer
dtype: string
- name: text
dtype: string
- name: choices
sequence: string
- name: gold
dtype: int64
splits:
- name: test
num_bytes: 1316673
num_examples: 3829
download_size: 483832
dataset_size: 1316673
---
# Dataset Card for "flare-es-tsa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ORGJI/ASDA | 2023-08-29T04:18:34.000Z | [
"license:bigscience-openrail-m",
"region:us"
] | ORGJI | null | null | null | 0 | 0 | ---
license: bigscience-openrail-m
---
|
leofto/sanshyne | 2023-08-29T04:55:09.000Z | [
"region:us"
] | leofto | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_bongchoi__test-llama2-7b | 2023-09-16T19:36:24.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of bongchoi/test-llama2-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bongchoi/test-llama2-7b](https://huggingface.co/bongchoi/test-llama2-7b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bongchoi__test-llama2-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-16T19:36:12.019633](https://huggingface.co/datasets/open-llm-leaderboard/details_bongchoi__test-llama2-7b/blob/main/results_2023-09-16T19-36-12.019633.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001363255033557047,\n\
\ \"em_stderr\": 0.0003778609196461104,\n \"f1\": 0.05606543624161075,\n\
\ \"f1_stderr\": 0.0013211107078874738,\n \"acc\": 0.4057988012013119,\n\
\ \"acc_stderr\": 0.00970458141675358\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001363255033557047,\n \"em_stderr\": 0.0003778609196461104,\n\
\ \"f1\": 0.05606543624161075,\n \"f1_stderr\": 0.0013211107078874738\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0712661106899166,\n \
\ \"acc_stderr\": 0.007086462127954491\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7403314917127072,\n \"acc_stderr\": 0.012322700705552667\n\
\ }\n}\n```"
repo_url: https://huggingface.co/bongchoi/test-llama2-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|arc:challenge|25_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_16T19_36_12.019633
path:
- '**/details_harness|drop|3_2023-09-16T19-36-12.019633.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-16T19-36-12.019633.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_16T19_36_12.019633
path:
- '**/details_harness|gsm8k|5_2023-09-16T19-36-12.019633.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-16T19-36-12.019633.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hellaswag|10_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_16T19_36_12.019633
path:
- '**/details_harness|winogrande|5_2023-09-16T19-36-12.019633.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-16T19-36-12.019633.parquet'
- config_name: results
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- results_2023-08-29T04:25:39.762695.parquet
- split: 2023_09_16T19_36_12.019633
path:
- results_2023-09-16T19-36-12.019633.parquet
- split: latest
path:
- results_2023-09-16T19-36-12.019633.parquet
---
# Dataset Card for Evaluation run of bongchoi/test-llama2-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/bongchoi/test-llama2-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [bongchoi/test-llama2-7b](https://huggingface.co/bongchoi/test-llama2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bongchoi__test-llama2-7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-16T19:36:12.019633](https://huggingface.co/datasets/open-llm-leaderboard/details_bongchoi__test-llama2-7b/blob/main/results_2023-09-16T19-36-12.019633.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001363255033557047,
"em_stderr": 0.0003778609196461104,
"f1": 0.05606543624161075,
"f1_stderr": 0.0013211107078874738,
"acc": 0.4057988012013119,
"acc_stderr": 0.00970458141675358
},
"harness|drop|3": {
"em": 0.001363255033557047,
"em_stderr": 0.0003778609196461104,
"f1": 0.05606543624161075,
"f1_stderr": 0.0013211107078874738
},
"harness|gsm8k|5": {
"acc": 0.0712661106899166,
"acc_stderr": 0.007086462127954491
},
"harness|winogrande|5": {
"acc": 0.7403314917127072,
"acc_stderr": 0.012322700705552667
}
}
```
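For the accuracy-based tasks, the `"all"` block above appears to be the plain mean of the per-task scores. A minimal sketch checking this against the JSON shown, assuming that simple-mean aggregation rule (the leaderboard's exact aggregation code is not part of this card):

```python
import json

# Per-task accuracy entries copied from the latest-results JSON above.
results = json.loads("""
{
  "harness|gsm8k|5": {"acc": 0.0712661106899166, "acc_stderr": 0.007086462127954491},
  "harness|winogrande|5": {"acc": 0.7403314917127072, "acc_stderr": 0.012322700705552667}
}
""")

# Unweighted mean over the accuracy-based tasks.
accs = [task["acc"] for task in results.values()]
mean_acc = sum(accs) / len(accs)

# Matches the "acc" value reported in the "all" block: 0.4057988012013119
print(mean_acc)
```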
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
kimsiun/CReSE | 2023-08-29T05:31:04.000Z | [
"license:openrail",
"region:us"
] | kimsiun | null | null | null | 0 | 0 | ---
license: openrail
---
|
TaylorAI/RLCD-generated-preference-data | 2023-08-29T05:19:38.000Z | [
"region:us"
] | TaylorAI | null | null | null | 1 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: float64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: preference
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 149793678
num_examples: 167999
download_size: 87743717
dataset_size: 149793678
---
# Dataset Card for "RLCD-generated-preference-data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/fwv2_random_num_train_100_eval_100 | 2023-08-29T05:32:18.000Z | [
"region:us"
] | tyzhu | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: train_doc2id
path: data/train_doc2id-*
- split: train_id2doc
path: data/train_id2doc-*
- split: train_find_word
path: data/train_find_word-*
- split: eval_find_word
path: data/eval_find_word-*
- split: id_context_mapping
path: data/id_context_mapping-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 27122
num_examples: 300
- name: train_doc2id
num_bytes: 16692
num_examples: 200
- name: train_id2doc
num_bytes: 17292
num_examples: 200
- name: train_find_word
num_bytes: 9830
num_examples: 100
- name: eval_find_word
num_bytes: 9946
num_examples: 100
- name: id_context_mapping
num_bytes: 10892
num_examples: 200
download_size: 52332
dataset_size: 91774
---
# Dataset Card for "fwv2_random_num_train_100_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/fwv2_random_num_train_10000_eval_100 | 2023-08-29T05:33:00.000Z | [
"region:us"
] | tyzhu | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: train_doc2id
path: data/train_doc2id-*
- split: train_id2doc
path: data/train_id2doc-*
- split: train_find_word
path: data/train_find_word-*
- split: eval_find_word
path: data/eval_find_word-*
- split: id_context_mapping
path: data/id_context_mapping-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1909652
num_examples: 20100
- name: train_doc2id
num_bytes: 857494
num_examples: 10100
- name: train_id2doc
num_bytes: 887794
num_examples: 10100
- name: train_find_word
num_bytes: 1021858
num_examples: 10000
- name: eval_find_word
num_bytes: 10346
num_examples: 100
- name: id_context_mapping
num_bytes: 564594
num_examples: 10100
download_size: 2074803
dataset_size: 5251738
---
# Dataset Card for "fwv2_random_num_train_10000_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lersmooth/sd15models | 2023-08-29T05:33:10.000Z | [
"region:us"
] | lersmooth | null | null | null | 0 | 0 | Entry not found |
lersmooth/sd15-controlnet-models | 2023-08-29T05:34:13.000Z | [
"region:us"
] | lersmooth | null | null | null | 0 | 0 | Entry not found |
sevenwonder617/pysc | 2023-08-29T05:46:40.000Z | [
"license:apache-2.0",
"region:us"
] | sevenwonder617 | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
yeonsikc/conist40k-sr | 2023-08-29T07:34:26.000Z | [
"region:us"
] | yeonsikc | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: caption
dtype: string
- name: caption_text
dtype: string
- name: condition_noText
dtype: bool
- name: condition_person
dtype: bool
splits:
- name: train
num_bytes: 89783617413.555
num_examples: 38985
download_size: 27501426883
dataset_size: 89783617413.555
---
# Dataset Card for "conist40k-sr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/fwv2_squad_num_train_100_eval_100 | 2023-08-29T08:05:01.000Z | [
"region:us"
] | tyzhu | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: train_doc2id
path: data/train_doc2id-*
- split: train_id2doc
path: data/train_id2doc-*
- split: train_find_word
path: data/train_find_word-*
- split: eval_find_word
path: data/eval_find_word-*
- split: id_context_mapping
path: data/id_context_mapping-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 45785
num_examples: 300
- name: train_doc2id
num_bytes: 34449
num_examples: 200
- name: train_id2doc
num_bytes: 35049
num_examples: 200
- name: train_find_word
num_bytes: 10736
num_examples: 100
- name: eval_find_word
num_bytes: 10344
num_examples: 100
- name: id_context_mapping
num_bytes: 28649
num_examples: 200
download_size: 104070
dataset_size: 165012
---
# Dataset Card for "fwv2_squad_num_train_100_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
genaibook/images | 2023-09-03T19:50:02.000Z | [
"license:mit",
"region:us"
] | genaibook | null | null | null | 0 | 0 | ---
license: mit
---
|
tyzhu/fwv2_squad_num_train_1000_eval_100 | 2023-08-29T08:05:34.000Z | [
"region:us"
] | tyzhu | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: train_doc2id
path: data/train_doc2id-*
- split: train_id2doc
path: data/train_id2doc-*
- split: train_find_word
path: data/train_find_word-*
- split: eval_find_word
path: data/eval_find_word-*
- split: id_context_mapping
path: data/id_context_mapping-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 300908
num_examples: 2100
- name: train_doc2id
num_bytes: 188562
num_examples: 1100
- name: train_id2doc
num_bytes: 191862
num_examples: 1100
- name: train_find_word
num_bytes: 109046
num_examples: 1000
- name: eval_find_word
num_bytes: 10620
num_examples: 100
- name: id_context_mapping
num_bytes: 156662
num_examples: 1100
download_size: 513271
dataset_size: 957660
---
# Dataset Card for "fwv2_squad_num_train_1000_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ethanweber/Mip-NeRF_360_Processed_with_Nerfstudio | 2023-08-29T07:02:22.000Z | [
"region:us"
] | ethanweber | null | null | null | 0 | 0 | Entry not found |
sevenwonder617/pys | 2023-08-29T06:04:09.000Z | [
"license:apache-2.0",
"region:us"
] | sevenwonder617 | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
TaylorAI/RLCD-SFT-dataset | 2023-08-29T06:11:58.000Z | [
"region:us"
] | TaylorAI | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: instruction
dtype: string
- name: preferred_output
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 121911383
num_examples: 167999
download_size: 71387145
dataset_size: 121911383
---
# Dataset Card for "RLCD-SFT-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lightphenexx/DNABert-test | 2023-08-29T06:10:44.000Z | [
"region:us"
] | lightphenexx | null | null | null | 0 | 0 | Entry not found |
Wangyesong/testdata | 2023-08-29T06:14:48.000Z | [
"license:apache-2.0",
"region:us"
] | Wangyesong | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
LahiruLowe/cot_filtered_3pertask | 2023-08-29T06:32:21.000Z | [
"region:us"
] | LahiruLowe | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: original_index
dtype: int64
- name: inputs
dtype: string
- name: targets
dtype: string
- name: task_source
dtype: string
- name: task_name
dtype: string
- name: template_type
dtype: string
splits:
- name: train
num_bytes: 22427
num_examples: 54
download_size: 15007
dataset_size: 22427
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "cot_filtered_3pertask"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/fwv2_random_num_tip_train_100_eval_100 | 2023-08-29T06:43:08.000Z | [
"region:us"
] | tyzhu | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: train_doc2id
path: data/train_doc2id-*
- split: train_id2doc
path: data/train_id2doc-*
- split: train_find_word
path: data/train_find_word-*
- split: eval_find_word
path: data/eval_find_word-*
- split: id_context_mapping
path: data/id_context_mapping-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 39706
num_examples: 300
- name: train_doc2id
num_bytes: 16692
num_examples: 200
- name: train_id2doc
num_bytes: 17292
num_examples: 200
- name: train_find_word
num_bytes: 22414
num_examples: 100
- name: eval_find_word
num_bytes: 16346
num_examples: 100
- name: id_context_mapping
num_bytes: 10892
num_examples: 200
download_size: 41369
dataset_size: 123342
---
# Dataset Card for "fwv2_random_num_tip_train_100_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NobodyExistsOnTheInternet/500correct | 2023-08-29T06:43:47.000Z | [
"license:mit",
"region:us"
] | NobodyExistsOnTheInternet | null | null | null | 0 | 0 | ---
license: mit
---
|
vivekraina/stanford_dataset_qa_final | 2023-08-29T06:46:06.000Z | [
"region:us"
] | vivekraina | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: paragraphs
list:
- name: context
dtype: string
- name: qas
list:
- name: answers
list:
- name: answer_start
dtype: int64
- name: text
dtype: string
- name: id
dtype: string
- name: question
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 3745671
num_examples: 48
download_size: 1775277
dataset_size: 3745671
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "stanford_dataset_qa_final"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
harish-blu/blutag-cc | 2023-08-29T08:31:21.000Z | [
"region:us"
] | harish-blu | Demo dataset for testing or showing image-text capabilities. | @InProceedings{huggingface:dataset,
title = {Small image-text set},
author={James Briggs},
year={2022}
} | null | 0 | 0 | Entry not found |
tyzhu/fwv2_random_num_tip_train_10_eval_10 | 2023-08-29T07:02:31.000Z | [
"region:us"
] | tyzhu | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: train_doc2id
path: data/train_doc2id-*
- split: train_id2doc
path: data/train_id2doc-*
- split: train_find_word
path: data/train_find_word-*
- split: eval_find_word
path: data/eval_find_word-*
- split: id_context_mapping
path: data/id_context_mapping-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 3919
num_examples: 30
- name: train_doc2id
num_bytes: 1651
num_examples: 20
- name: train_id2doc
num_bytes: 1711
num_examples: 20
- name: train_find_word
num_bytes: 2208
num_examples: 10
- name: eval_find_word
num_bytes: 1604
num_examples: 10
- name: id_context_mapping
num_bytes: 1071
num_examples: 20
download_size: 19912
dataset_size: 12164
---
# Dataset Card for "fwv2_random_num_tip_train_10_eval_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NobodyExistsOnTheInternet/10kEqnsGPT4 | 2023-08-29T07:07:17.000Z | [
"license:mit",
"region:us"
] | NobodyExistsOnTheInternet | null | null | null | 0 | 0 | ---
license: mit
---
|