id | lastModified | tags | author | description | citation | cardData | likes | downloads | card |
|---|---|---|---|---|---|---|---|---|---|
text2font/words_with_path_tags_version_2_valid | 2023-10-10T20:31:57.000Z | ["region:us"] | text2font | null | null | null | 0 | 0 | Entry not found |
text2font/words_with_path_tags_version_2_test | 2023-10-10T15:30:45.000Z | ["region:us"] | text2font | null | null | null | 0 | 0 | Entry not found |
nalmeida/test_local2 | 2023-10-10T15:40:10.000Z | ["region:us"] | nalmeida | null | null | null | 0 | 0 | Entry not found |
nlewins/fleurs_ceb_to_en | 2023-10-10T15:44:08.000Z | ["region:us"] | nlewins | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: int32
- name: transcription
dtype: string
- name: language
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription_en
dtype: string
- name: audio_en
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 4837972886.028
num_examples: 3188
- name: validation
num_bytes: 332770769.0
num_examples: 225
- name: test
num_bytes: 834809869.0
num_examples: 541
download_size: 5885482902
dataset_size: 6005553524.028
---
# Dataset Card for "fleurs_ceb_to_en_2"
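The YAML config above declares paired Cebuano and English transcriptions with 16 kHz audio. As a quick illustration, a minimal loading sketch (assuming the standard `datasets` API; the repo id is taken from this row's id column and the field names from the feature list above):

```python
from datasets import load_dataset

# Minimal sketch: load the train split declared in the YAML above and
# inspect one Cebuano/English pair.
ds = load_dataset("nlewins/fleurs_ceb_to_en", split="train")

example = ds[0]
print(example["transcription"])           # Cebuano transcription
print(example["transcription_en"])        # English transcription
print(example["audio"]["sampling_rate"])  # 16000, per the config above
```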
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Lazycuber__L2-7b-Orca-WVG-Test | 2023-10-10T15:41:01.000Z | ["region:us"] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Lazycuber/L2-7b-Orca-WVG-Test
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Lazycuber/L2-7b-Orca-WVG-Test](https://huggingface.co/Lazycuber/L2-7b-Orca-WVG-Test)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Lazycuber__L2-7b-Orca-WVG-Test\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T15:39:37.735727](https://huggingface.co/datasets/open-llm-leaderboard/details_Lazycuber__L2-7b-Orca-WVG-Test/blob/main/results_2023-10-10T15-39-37.735727.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.512701362043529,\n\
\ \"acc_stderr\": 0.03493895100033154,\n \"acc_norm\": 0.5164965980804035,\n\
\ \"acc_norm_stderr\": 0.03492452583764037,\n \"mc1\": 0.3023255813953488,\n\
\ \"mc1_stderr\": 0.016077509266133026,\n \"mc2\": 0.436784676933468,\n\
\ \"mc2_stderr\": 0.014891030280754473\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5162116040955631,\n \"acc_stderr\": 0.014603708567414938,\n\
\ \"acc_norm\": 0.5486348122866894,\n \"acc_norm_stderr\": 0.014542104569955267\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5910177255526787,\n\
\ \"acc_stderr\": 0.00490641198447679,\n \"acc_norm\": 0.782513443537144,\n\
\ \"acc_norm_stderr\": 0.004116931383157353\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5259259259259259,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.5259259259259259,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.04046336883978251,\n\
\ \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.04046336883978251\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5347222222222222,\n\
\ \"acc_stderr\": 0.041711158581816184,\n \"acc_norm\": 0.5347222222222222,\n\
\ \"acc_norm_stderr\": 0.041711158581816184\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4797687861271676,\n\
\ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.4797687861271676,\n\
\ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.044405219061793275,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.044405219061793275\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4723404255319149,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.4723404255319149,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.043391383225798615,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.043391383225798615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2857142857142857,\n \"acc_stderr\": 0.02326651221373057,\n \"\
acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.02326651221373057\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n\
\ \"acc_stderr\": 0.039701582732351734,\n \"acc_norm\": 0.2698412698412698,\n\
\ \"acc_norm_stderr\": 0.039701582732351734\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5387096774193548,\n \"acc_stderr\": 0.028358634859836935,\n \"\
acc_norm\": 0.5387096774193548,\n \"acc_norm_stderr\": 0.028358634859836935\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3842364532019704,\n \"acc_stderr\": 0.034223985656575494,\n \"\
acc_norm\": 0.3842364532019704,\n \"acc_norm_stderr\": 0.034223985656575494\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.03663974994391245,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.03663974994391245\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6616161616161617,\n \"acc_stderr\": 0.033711241426263014,\n \"\
acc_norm\": 0.6616161616161617,\n \"acc_norm_stderr\": 0.033711241426263014\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7098445595854922,\n \"acc_stderr\": 0.032752644677915166,\n\
\ \"acc_norm\": 0.7098445595854922,\n \"acc_norm_stderr\": 0.032752644677915166\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.44871794871794873,\n \"acc_stderr\": 0.025217315184846482,\n\
\ \"acc_norm\": 0.44871794871794873,\n \"acc_norm_stderr\": 0.025217315184846482\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.47478991596638653,\n \"acc_stderr\": 0.0324371805513741,\n \
\ \"acc_norm\": 0.47478991596638653,\n \"acc_norm_stderr\": 0.0324371805513741\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526733,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526733\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.708256880733945,\n \"acc_stderr\": 0.01948930096887651,\n \"acc_norm\"\
: 0.708256880733945,\n \"acc_norm_stderr\": 0.01948930096887651\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.37962962962962965,\n\
\ \"acc_stderr\": 0.03309682581119035,\n \"acc_norm\": 0.37962962962962965,\n\
\ \"acc_norm_stderr\": 0.03309682581119035\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7107843137254902,\n \"acc_stderr\": 0.031822318676475544,\n\
\ \"acc_norm\": 0.7107843137254902,\n \"acc_norm_stderr\": 0.031822318676475544\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5964125560538116,\n\
\ \"acc_stderr\": 0.03292802819330314,\n \"acc_norm\": 0.5964125560538116,\n\
\ \"acc_norm_stderr\": 0.03292802819330314\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.04243869242230524,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.04243869242230524\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6611570247933884,\n \"acc_stderr\": 0.043207678075366705,\n \"\
acc_norm\": 0.6611570247933884,\n \"acc_norm_stderr\": 0.043207678075366705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6481481481481481,\n\
\ \"acc_stderr\": 0.046166311118017146,\n \"acc_norm\": 0.6481481481481481,\n\
\ \"acc_norm_stderr\": 0.046166311118017146\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5705521472392638,\n \"acc_stderr\": 0.03889066619112722,\n\
\ \"acc_norm\": 0.5705521472392638,\n \"acc_norm_stderr\": 0.03889066619112722\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7435897435897436,\n\
\ \"acc_stderr\": 0.028605953702004257,\n \"acc_norm\": 0.7435897435897436,\n\
\ \"acc_norm_stderr\": 0.028605953702004257\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7164750957854407,\n\
\ \"acc_stderr\": 0.016117318166832265,\n \"acc_norm\": 0.7164750957854407,\n\
\ \"acc_norm_stderr\": 0.016117318166832265\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5606936416184971,\n \"acc_stderr\": 0.026720034380514995,\n\
\ \"acc_norm\": 0.5606936416184971,\n \"acc_norm_stderr\": 0.026720034380514995\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2837988826815642,\n\
\ \"acc_stderr\": 0.015078358970751752,\n \"acc_norm\": 0.2837988826815642,\n\
\ \"acc_norm_stderr\": 0.015078358970751752\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5326797385620915,\n \"acc_stderr\": 0.02856869975222587,\n\
\ \"acc_norm\": 0.5326797385620915,\n \"acc_norm_stderr\": 0.02856869975222587\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6012861736334405,\n\
\ \"acc_stderr\": 0.0278093225857745,\n \"acc_norm\": 0.6012861736334405,\n\
\ \"acc_norm_stderr\": 0.0278093225857745\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.558641975308642,\n \"acc_stderr\": 0.027628737155668777,\n\
\ \"acc_norm\": 0.558641975308642,\n \"acc_norm_stderr\": 0.027628737155668777\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3900709219858156,\n \"acc_stderr\": 0.02909767559946393,\n \
\ \"acc_norm\": 0.3900709219858156,\n \"acc_norm_stderr\": 0.02909767559946393\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.379400260756193,\n\
\ \"acc_stderr\": 0.012393202029825402,\n \"acc_norm\": 0.379400260756193,\n\
\ \"acc_norm_stderr\": 0.012393202029825402\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.030320243265004137,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.030320243265004137\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4852941176470588,\n \"acc_stderr\": 0.020219083895133924,\n \
\ \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.020219083895133924\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n\
\ \"acc_stderr\": 0.04724577405731572,\n \"acc_norm\": 0.5818181818181818,\n\
\ \"acc_norm_stderr\": 0.04724577405731572\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5591836734693878,\n \"acc_stderr\": 0.03178419114175363,\n\
\ \"acc_norm\": 0.5591836734693878,\n \"acc_norm_stderr\": 0.03178419114175363\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6268656716417911,\n\
\ \"acc_stderr\": 0.03419832608176007,\n \"acc_norm\": 0.6268656716417911,\n\
\ \"acc_norm_stderr\": 0.03419832608176007\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.034462962170884265,\n\
\ \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.034462962170884265\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3023255813953488,\n\
\ \"mc1_stderr\": 0.016077509266133026,\n \"mc2\": 0.436784676933468,\n\
\ \"mc2_stderr\": 0.014891030280754473\n }\n}\n```"
repo_url: https://huggingface.co/Lazycuber/L2-7b-Orca-WVG-Test
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|arc:challenge|25_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hellaswag|10_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-39-37.735727.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-39-37.735727.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T15-39-37.735727.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T15-39-37.735727.parquet'
- config_name: results
data_files:
- split: 2023_10_10T15_39_37.735727
path:
- results_2023-10-10T15-39-37.735727.parquet
- split: latest
path:
- results_2023-10-10T15-39-37.735727.parquet
---
# Dataset Card for Evaluation run of Lazycuber/L2-7b-Orca-WVG-Test
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Lazycuber/L2-7b-Orca-WVG-Test
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Lazycuber/L2-7b-Orca-WVG-Test](https://huggingface.co/Lazycuber/L2-7b-Orca-WVG-Test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Lazycuber__L2-7b-Orca-WVG-Test",
"harness_truthfulqa_mc_0",
split="train")
```
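The configs section of the YAML above also exposes a "results" configuration with a "latest" split; a similar minimal sketch (same `datasets` API, with the config and split names taken from that YAML) pulls the aggregated metrics:

```python
from datasets import load_dataset

# Sketch: load the aggregated "results" config at its "latest" split
# (both declared in the configs section of the YAML above).
results = load_dataset(
    "open-llm-leaderboard/details_Lazycuber__L2-7b-Orca-WVG-Test",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated metrics for this run
```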
## Latest results
These are the [latest results from run 2023-10-10T15:39:37.735727](https://huggingface.co/datasets/open-llm-leaderboard/details_Lazycuber__L2-7b-Orca-WVG-Test/blob/main/results_2023-10-10T15-39-37.735727.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.512701362043529,
"acc_stderr": 0.03493895100033154,
"acc_norm": 0.5164965980804035,
"acc_norm_stderr": 0.03492452583764037,
"mc1": 0.3023255813953488,
"mc1_stderr": 0.016077509266133026,
"mc2": 0.436784676933468,
"mc2_stderr": 0.014891030280754473
},
"harness|arc:challenge|25": {
"acc": 0.5162116040955631,
"acc_stderr": 0.014603708567414938,
"acc_norm": 0.5486348122866894,
"acc_norm_stderr": 0.014542104569955267
},
"harness|hellaswag|10": {
"acc": 0.5910177255526787,
"acc_stderr": 0.00490641198447679,
"acc_norm": 0.782513443537144,
"acc_norm_stderr": 0.004116931383157353
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5259259259259259,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.5259259259259259,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04046336883978251,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04046336883978251
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5347222222222222,
"acc_stderr": 0.041711158581816184,
"acc_norm": 0.5347222222222222,
"acc_norm_stderr": 0.041711158581816184
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4797687861271676,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.4797687861271676,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.044405219061793275,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.044405219061793275
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4723404255319149,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.4723404255319149,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.043391383225798615,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.043391383225798615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.02326651221373057,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.02326651221373057
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.039701582732351734,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.039701582732351734
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5387096774193548,
"acc_stderr": 0.028358634859836935,
"acc_norm": 0.5387096774193548,
"acc_norm_stderr": 0.028358634859836935
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3842364532019704,
"acc_stderr": 0.034223985656575494,
"acc_norm": 0.3842364532019704,
"acc_norm_stderr": 0.034223985656575494
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.03663974994391245,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.03663974994391245
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6616161616161617,
"acc_stderr": 0.033711241426263014,
"acc_norm": 0.6616161616161617,
"acc_norm_stderr": 0.033711241426263014
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7098445595854922,
"acc_stderr": 0.032752644677915166,
"acc_norm": 0.7098445595854922,
"acc_norm_stderr": 0.032752644677915166
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.44871794871794873,
"acc_stderr": 0.025217315184846482,
"acc_norm": 0.44871794871794873,
"acc_norm_stderr": 0.025217315184846482
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.47478991596638653,
"acc_stderr": 0.0324371805513741,
"acc_norm": 0.47478991596638653,
"acc_norm_stderr": 0.0324371805513741
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526733,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526733
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.708256880733945,
"acc_stderr": 0.01948930096887651,
"acc_norm": 0.708256880733945,
"acc_norm_stderr": 0.01948930096887651
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37962962962962965,
"acc_stderr": 0.03309682581119035,
"acc_norm": 0.37962962962962965,
"acc_norm_stderr": 0.03309682581119035
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7107843137254902,
"acc_stderr": 0.031822318676475544,
"acc_norm": 0.7107843137254902,
"acc_norm_stderr": 0.031822318676475544
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5964125560538116,
"acc_stderr": 0.03292802819330314,
"acc_norm": 0.5964125560538116,
"acc_norm_stderr": 0.03292802819330314
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.04243869242230524,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.04243869242230524
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6611570247933884,
"acc_stderr": 0.043207678075366705,
"acc_norm": 0.6611570247933884,
"acc_norm_stderr": 0.043207678075366705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.046166311118017146,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.046166311118017146
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5705521472392638,
"acc_stderr": 0.03889066619112722,
"acc_norm": 0.5705521472392638,
"acc_norm_stderr": 0.03889066619112722
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7435897435897436,
"acc_stderr": 0.028605953702004257,
"acc_norm": 0.7435897435897436,
"acc_norm_stderr": 0.028605953702004257
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7164750957854407,
"acc_stderr": 0.016117318166832265,
"acc_norm": 0.7164750957854407,
"acc_norm_stderr": 0.016117318166832265
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5606936416184971,
"acc_stderr": 0.026720034380514995,
"acc_norm": 0.5606936416184971,
"acc_norm_stderr": 0.026720034380514995
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2837988826815642,
"acc_stderr": 0.015078358970751752,
"acc_norm": 0.2837988826815642,
"acc_norm_stderr": 0.015078358970751752
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5326797385620915,
"acc_stderr": 0.02856869975222587,
"acc_norm": 0.5326797385620915,
"acc_norm_stderr": 0.02856869975222587
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6012861736334405,
"acc_stderr": 0.0278093225857745,
"acc_norm": 0.6012861736334405,
"acc_norm_stderr": 0.0278093225857745
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.558641975308642,
"acc_stderr": 0.027628737155668777,
"acc_norm": 0.558641975308642,
"acc_norm_stderr": 0.027628737155668777
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3900709219858156,
"acc_stderr": 0.02909767559946393,
"acc_norm": 0.3900709219858156,
"acc_norm_stderr": 0.02909767559946393
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.379400260756193,
"acc_stderr": 0.012393202029825402,
"acc_norm": 0.379400260756193,
"acc_norm_stderr": 0.012393202029825402
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.030320243265004137,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.030320243265004137
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.020219083895133924,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.020219083895133924
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731572,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731572
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5591836734693878,
"acc_stderr": 0.03178419114175363,
"acc_norm": 0.5591836734693878,
"acc_norm_stderr": 0.03178419114175363
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6268656716417911,
"acc_stderr": 0.03419832608176007,
"acc_norm": 0.6268656716417911,
"acc_norm_stderr": 0.03419832608176007
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.038367221765980515,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.038367221765980515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7192982456140351,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.7192982456140351,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3023255813953488,
"mc1_stderr": 0.016077509266133026,
"mc2": 0.436784676933468,
"mc2_stderr": 0.014891030280754473
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
PHNG/chatmed-cai-en | 2023-10-10T15:47:13.000Z | [
"region:us"
] | PHNG | null | null | null | 0 | 0 | |
open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-LoRa | 2023-10-10T15:51:08.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Severian/ANIMA-Phi-Neptune-Mistral-LoRa
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Severian/ANIMA-Phi-Neptune-Mistral-LoRa](https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-LoRa)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-LoRa\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T15:49:43.201517](https://huggingface.co/datasets/open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-LoRa/blob/main/results_2023-10-10T15-49-43.201517.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5216421138363554,\n\
\ \"acc_stderr\": 0.034984720641748575,\n \"acc_norm\": 0.5252295168080844,\n\
\ \"acc_norm_stderr\": 0.03497398634134131,\n \"mc1\": 0.412484700122399,\n\
\ \"mc1_stderr\": 0.017233299399571227,\n \"mc2\": 0.5938354841447588,\n\
\ \"mc2_stderr\": 0.015090386269121684\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.014611390804670088,\n \
\ \"acc_norm\": 0.5307167235494881,\n \"acc_norm_stderr\": 0.014583792546304037\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5656243776140211,\n\
\ \"acc_stderr\": 0.004946617138983521,\n \"acc_norm\": 0.7465644293965346,\n\
\ \"acc_norm_stderr\": 0.004340891673320502\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4074074074074074,\n\
\ \"acc_stderr\": 0.042446332383532286,\n \"acc_norm\": 0.4074074074074074,\n\
\ \"acc_norm_stderr\": 0.042446332383532286\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4868421052631579,\n \"acc_stderr\": 0.04067533136309174,\n\
\ \"acc_norm\": 0.4868421052631579,\n \"acc_norm_stderr\": 0.04067533136309174\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5849056603773585,\n \"acc_stderr\": 0.03032594578928611,\n\
\ \"acc_norm\": 0.5849056603773585,\n \"acc_norm_stderr\": 0.03032594578928611\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5375722543352601,\n\
\ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.5375722543352601,\n\
\ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929777,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929777\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4553191489361702,\n \"acc_stderr\": 0.03255525359340355,\n\
\ \"acc_norm\": 0.4553191489361702,\n \"acc_norm_stderr\": 0.03255525359340355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.04514496132873633,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.04514496132873633\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35714285714285715,\n \"acc_stderr\": 0.024677862841332783,\n \"\
acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.024677862841332783\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5838709677419355,\n\
\ \"acc_stderr\": 0.028040981380761547,\n \"acc_norm\": 0.5838709677419355,\n\
\ \"acc_norm_stderr\": 0.028040981380761547\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3448275862068966,\n \"acc_stderr\": 0.033442837442804574,\n\
\ \"acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.033442837442804574\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.036639749943912434,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.036639749943912434\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6212121212121212,\n \"acc_stderr\": 0.03456088731993747,\n \"\
acc_norm\": 0.6212121212121212,\n \"acc_norm_stderr\": 0.03456088731993747\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7202072538860104,\n \"acc_stderr\": 0.03239637046735704,\n\
\ \"acc_norm\": 0.7202072538860104,\n \"acc_norm_stderr\": 0.03239637046735704\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4641025641025641,\n \"acc_stderr\": 0.02528558599001784,\n \
\ \"acc_norm\": 0.4641025641025641,\n \"acc_norm_stderr\": 0.02528558599001784\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.03242225027115007,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.03242225027115007\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.710091743119266,\n \"acc_stderr\": 0.0194530666092016,\n \"acc_norm\"\
: 0.710091743119266,\n \"acc_norm_stderr\": 0.0194530666092016\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.37962962962962965,\n\
\ \"acc_stderr\": 0.03309682581119035,\n \"acc_norm\": 0.37962962962962965,\n\
\ \"acc_norm_stderr\": 0.03309682581119035\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03308611113236435,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03308611113236435\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6751054852320675,\n \"acc_stderr\": 0.030486039389105307,\n \
\ \"acc_norm\": 0.6751054852320675,\n \"acc_norm_stderr\": 0.030486039389105307\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n\
\ \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n\
\ \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.04317171194870255,\n\
\ \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.04317171194870255\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6198347107438017,\n \"acc_stderr\": 0.04431324501968432,\n \"\
acc_norm\": 0.6198347107438017,\n \"acc_norm_stderr\": 0.04431324501968432\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6134969325153374,\n \"acc_stderr\": 0.03825825548848607,\n\
\ \"acc_norm\": 0.6134969325153374,\n \"acc_norm_stderr\": 0.03825825548848607\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.811965811965812,\n\
\ \"acc_stderr\": 0.02559819368665226,\n \"acc_norm\": 0.811965811965812,\n\
\ \"acc_norm_stderr\": 0.02559819368665226\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7075351213282248,\n\
\ \"acc_stderr\": 0.016267000684598642,\n \"acc_norm\": 0.7075351213282248,\n\
\ \"acc_norm_stderr\": 0.016267000684598642\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5404624277456648,\n \"acc_stderr\": 0.026830805998952236,\n\
\ \"acc_norm\": 0.5404624277456648,\n \"acc_norm_stderr\": 0.026830805998952236\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2759776536312849,\n\
\ \"acc_stderr\": 0.014950103002475349,\n \"acc_norm\": 0.2759776536312849,\n\
\ \"acc_norm_stderr\": 0.014950103002475349\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5620915032679739,\n \"acc_stderr\": 0.028408302020332694,\n\
\ \"acc_norm\": 0.5620915032679739,\n \"acc_norm_stderr\": 0.028408302020332694\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5980707395498392,\n\
\ \"acc_stderr\": 0.027846476005930473,\n \"acc_norm\": 0.5980707395498392,\n\
\ \"acc_norm_stderr\": 0.027846476005930473\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6172839506172839,\n \"acc_stderr\": 0.027044538138402616,\n\
\ \"acc_norm\": 0.6172839506172839,\n \"acc_norm_stderr\": 0.027044538138402616\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3617021276595745,\n \"acc_stderr\": 0.028663820147199492,\n \
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.028663820147199492\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38852672750977835,\n\
\ \"acc_stderr\": 0.012448817838292355,\n \"acc_norm\": 0.38852672750977835,\n\
\ \"acc_norm_stderr\": 0.012448817838292355\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.47794117647058826,\n \"acc_stderr\": 0.030343264224213542,\n\
\ \"acc_norm\": 0.47794117647058826,\n \"acc_norm_stderr\": 0.030343264224213542\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.511437908496732,\n \"acc_stderr\": 0.020222541515610863,\n \
\ \"acc_norm\": 0.511437908496732,\n \"acc_norm_stderr\": 0.020222541515610863\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.04653429807913507,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.04653429807913507\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6204081632653061,\n \"acc_stderr\": 0.03106721126287248,\n\
\ \"acc_norm\": 0.6204081632653061,\n \"acc_norm_stderr\": 0.03106721126287248\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7014925373134329,\n\
\ \"acc_stderr\": 0.03235743789355044,\n \"acc_norm\": 0.7014925373134329,\n\
\ \"acc_norm_stderr\": 0.03235743789355044\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.43373493975903615,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7017543859649122,\n \"acc_stderr\": 0.03508771929824563,\n\
\ \"acc_norm\": 0.7017543859649122,\n \"acc_norm_stderr\": 0.03508771929824563\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.412484700122399,\n\
\ \"mc1_stderr\": 0.017233299399571227,\n \"mc2\": 0.5938354841447588,\n\
\ \"mc2_stderr\": 0.015090386269121684\n }\n}\n```"
repo_url: https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-LoRa
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|arc:challenge|25_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hellaswag|10_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-49-43.201517.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-49-43.201517.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T15-49-43.201517.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T15-49-43.201517.parquet'
- config_name: results
data_files:
- split: 2023_10_10T15_49_43.201517
path:
- results_2023-10-10T15-49-43.201517.parquet
- split: latest
path:
- results_2023-10-10T15-49-43.201517.parquet
---
# Dataset Card for Evaluation run of Severian/ANIMA-Phi-Neptune-Mistral-LoRa
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-LoRa
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Severian/ANIMA-Phi-Neptune-Mistral-LoRa](https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-LoRa) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-LoRa",
"harness_truthfulqa_mc_0",
split="train")
```
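As a complementary sketch, the aggregated metrics can be loaded through the "results" configuration; the config and split names below are taken from the `configs` section of this card:
```python
from datasets import load_dataset

# "results" holds the aggregated metrics for the run; the "latest" split
# points to the most recent results file
# (results_2023-10-10T15-49-43.201517.parquet, per the configs above).
results = load_dataset(
    "open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-LoRa",
    "results",
    split="latest",
)
```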
## Latest results
These are the [latest results from run 2023-10-10T15:49:43.201517](https://huggingface.co/datasets/open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-LoRa/blob/main/results_2023-10-10T15-49-43.201517.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5216421138363554,
"acc_stderr": 0.034984720641748575,
"acc_norm": 0.5252295168080844,
"acc_norm_stderr": 0.03497398634134131,
"mc1": 0.412484700122399,
"mc1_stderr": 0.017233299399571227,
"mc2": 0.5938354841447588,
"mc2_stderr": 0.015090386269121684
},
"harness|arc:challenge|25": {
"acc": 0.5,
"acc_stderr": 0.014611390804670088,
"acc_norm": 0.5307167235494881,
"acc_norm_stderr": 0.014583792546304037
},
"harness|hellaswag|10": {
"acc": 0.5656243776140211,
"acc_stderr": 0.004946617138983521,
"acc_norm": 0.7465644293965346,
"acc_norm_stderr": 0.004340891673320502
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.042446332383532286,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.042446332383532286
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4868421052631579,
"acc_stderr": 0.04067533136309174,
"acc_norm": 0.4868421052631579,
"acc_norm_stderr": 0.04067533136309174
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5849056603773585,
"acc_stderr": 0.03032594578928611,
"acc_norm": 0.5849056603773585,
"acc_norm_stderr": 0.03032594578928611
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5375722543352601,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.5375722543352601,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929777,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929777
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4553191489361702,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.4553191489361702,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.04514496132873633,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.04514496132873633
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.024677862841332783,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.024677862841332783
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5838709677419355,
"acc_stderr": 0.028040981380761547,
"acc_norm": 0.5838709677419355,
"acc_norm_stderr": 0.028040981380761547
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.033442837442804574,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.033442837442804574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.036639749943912434,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.036639749943912434
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6212121212121212,
"acc_stderr": 0.03456088731993747,
"acc_norm": 0.6212121212121212,
"acc_norm_stderr": 0.03456088731993747
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7202072538860104,
"acc_stderr": 0.03239637046735704,
"acc_norm": 0.7202072538860104,
"acc_norm_stderr": 0.03239637046735704
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4641025641025641,
"acc_stderr": 0.02528558599001784,
"acc_norm": 0.4641025641025641,
"acc_norm_stderr": 0.02528558599001784
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085626,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.03242225027115007,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.03242225027115007
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.710091743119266,
"acc_stderr": 0.0194530666092016,
"acc_norm": 0.710091743119266,
"acc_norm_stderr": 0.0194530666092016
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37962962962962965,
"acc_stderr": 0.03309682581119035,
"acc_norm": 0.37962962962962965,
"acc_norm_stderr": 0.03309682581119035
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03308611113236435,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03308611113236435
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6751054852320675,
"acc_stderr": 0.030486039389105307,
"acc_norm": 0.6751054852320675,
"acc_norm_stderr": 0.030486039389105307
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6053811659192825,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.6053811659192825,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.04317171194870255,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.04317171194870255
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6198347107438017,
"acc_stderr": 0.04431324501968432,
"acc_norm": 0.6198347107438017,
"acc_norm_stderr": 0.04431324501968432
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04643454608906275,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04643454608906275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6134969325153374,
"acc_stderr": 0.03825825548848607,
"acc_norm": 0.6134969325153374,
"acc_norm_stderr": 0.03825825548848607
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.811965811965812,
"acc_stderr": 0.02559819368665226,
"acc_norm": 0.811965811965812,
"acc_norm_stderr": 0.02559819368665226
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7075351213282248,
"acc_stderr": 0.016267000684598642,
"acc_norm": 0.7075351213282248,
"acc_norm_stderr": 0.016267000684598642
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5404624277456648,
"acc_stderr": 0.026830805998952236,
"acc_norm": 0.5404624277456648,
"acc_norm_stderr": 0.026830805998952236
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2759776536312849,
"acc_stderr": 0.014950103002475349,
"acc_norm": 0.2759776536312849,
"acc_norm_stderr": 0.014950103002475349
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5620915032679739,
"acc_stderr": 0.028408302020332694,
"acc_norm": 0.5620915032679739,
"acc_norm_stderr": 0.028408302020332694
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5980707395498392,
"acc_stderr": 0.027846476005930473,
"acc_norm": 0.5980707395498392,
"acc_norm_stderr": 0.027846476005930473
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6172839506172839,
"acc_stderr": 0.027044538138402616,
"acc_norm": 0.6172839506172839,
"acc_norm_stderr": 0.027044538138402616
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.028663820147199492,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.028663820147199492
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.38852672750977835,
"acc_stderr": 0.012448817838292355,
"acc_norm": 0.38852672750977835,
"acc_norm_stderr": 0.012448817838292355
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.47794117647058826,
"acc_stderr": 0.030343264224213542,
"acc_norm": 0.47794117647058826,
"acc_norm_stderr": 0.030343264224213542
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.511437908496732,
"acc_stderr": 0.020222541515610863,
"acc_norm": 0.511437908496732,
"acc_norm_stderr": 0.020222541515610863
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.04653429807913507,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.04653429807913507
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6204081632653061,
"acc_stderr": 0.03106721126287248,
"acc_norm": 0.6204081632653061,
"acc_norm_stderr": 0.03106721126287248
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7014925373134329,
"acc_stderr": 0.03235743789355044,
"acc_norm": 0.7014925373134329,
"acc_norm_stderr": 0.03235743789355044
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7017543859649122,
"acc_stderr": 0.03508771929824563,
"acc_norm": 0.7017543859649122,
"acc_norm_stderr": 0.03508771929824563
},
"harness|truthfulqa:mc|0": {
"mc1": 0.412484700122399,
"mc1_stderr": 0.017233299399571227,
"mc2": 0.5938354841447588,
"mc2_stderr": 0.015090386269121684
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
borggAI/bittensor-prompting-10102023 | 2023-10-10T15:57:45.000Z | [
"region:us"
] | borggAI | null | null | null | 0 | 0 | Entry not found |
Mouli07/ROCO_Chest_Xray_v1 | 2023-10-10T17:12:44.000Z | [
"region:us"
] | Mouli07 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 4079190365.452
num_examples: 140612
- name: validation
num_bytes: 493342548.63
num_examples: 17570
- name: test
num_bytes: 499414242.052
num_examples: 17572
download_size: 4812974957
dataset_size: 5071947156.134
---
# Dataset Card for "ROCO_Chest_Xray_v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nalmeida/test2 | 2023-10-10T16:03:36.000Z | [
"region:us"
] | nalmeida | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: pregunta
dtype: string
- name: respuesta
dtype: string
splits:
- name: train
num_bytes: 498
num_examples: 5
download_size: 2175
dataset_size: 498
---
# Dataset Card for "test2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nalmeida/test3 | 2023-10-10T16:05:22.000Z | [
"region:us"
] | nalmeida | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 487027
num_examples: 321
download_size: 146233
dataset_size: 487027
---
# Dataset Card for "test3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zuko2/conditional-translation-me-en-me | 2023-10-10T16:18:57.000Z | [
"license:mit",
"region:us"
] | zuko2 | null | null | null | 0 | 0 | ---
license: mit
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
dataset_info:
features:
- name: src
dtype: string
- name: tgt
dtype: string
splits:
- name: train
num_bytes: 23020866
num_examples: 112170
- name: valid
num_bytes: 105436
num_examples: 1000
- name: test
num_bytes: 52976
num_examples: 500
download_size: 12300184
dataset_size: 23179278
---
|
reach-vb/random-audios | 2023-10-10T16:09:29.000Z | [
"region:us"
] | reach-vb | null | null | null | 0 | 0 | Entry not found |
nalmeida/securitas_300 | 2023-10-10T16:26:05.000Z | [
"region:us"
] | nalmeida | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 485743
num_examples: 321
download_size: 145607
dataset_size: 485743
---
# Dataset Card for "securitas_300"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
MrB141107/MrB141107 | 2023-10-10T16:13:39.000Z | [
"region:us"
] | MrB141107 | null | null | null | 0 | 0 | Entry not found |
marianna13/litarch | 2023-10-10T17:46:42.000Z | [
"language:en",
"license:cc",
"region:us"
] | marianna13 | null | null | null | 0 | 0 | ---
license: cc
language:
- en
---
Textbooks from the [NCBI Literature Archive](https://ftp.ncbi.nlm.nih.gov/pub/litarch/).
# Image-Text Pairs
```
[
['litarch_figures/ca/84/gene_NBK1116/angelmanF1.jpg', '\nIndividuals depicted have a genetically confirmed diagnosis of Angelman syndrome. Happy expression and an unstable gait accompanied by uplifted arms are commonly observed. At times, the facial appearance can suggest the diagnosis, but usually facial features are not distinctive.\n', ''],
['litarch_figures/ca/84/gene_NBK1116/angelmanF2.jpg', '\nSchematic drawing of chromosome region 15q11.2-q13 indicating the breakpoint regions BP1-BP6. Low copy repeat elements are located within these breakpoint regions (see text for details). Approximately 90% of chromosome deletions resulting in Angelman syndrome initiate at BP1 or BP2 and terminate in region BP3 (class I and class II). Approximately 10% of deletions are larger, typically spanning from BP1 to BP5, rarely beyond BP5. Genes that are not imprinted and thus biparentally expressed are noted by the open circles. The two critical imprinting center (IC) elements, the AS-SRO and the PWS-SRO, are drawn as open boxes. The gene SNRUF-SNRPN, drawn as a shaded box, has some overlap with the PWS-SRO. The SNURF-SNRPN sense/UBE3A antisense transcript is labeled UBE3A-AS.\n', ''],
['litarch_figures/ca/84/gene_NBK1116/angelmanF3.jpg', '\nThe pedigree illustrates imprinting inheritance in Angelman syndrome (AS). Inheritance of a deleterious UBE3A pathogenic variant from the male (top left, I-1) has no effect on the two children (II-2, II-4) who inherit his pathogenic variant because the mutated UBE3A has already been inactivated in his germ cells (i.e., by imprinting) and because each of these children also inherited a normally activated UBE3A from their mother (I-2). (Note: Only one active UBE3A allele is required for normal brain functioning.) If his carrier daughter (II-2) transmits the UBE3A pathogenic variant to the grandson and granddaughter (III-1, III-2), they both will have AS since each will have also inherited an inactivated UBE3A from their father; thus, neither child will express a UBE3A allele. The same explanation pertains for AS occurring in the great grand-niece (bottom right, IV-2).\n', '']
]
```
# Interleaved
```
[
["Getting by with the bare minimum seems to be the modus operandi of Mycobacterium leprae \u2014 the causal agent of leprosy. Its genome sequence reveals that it has undergone massive genome 'downsizing' over time, discarding more than half its genes and rendering it the most striking example of genome reduction in a microbial pathogen."],
["The leprosy bacillus is famed for being the first microorganism definitively shown to be associated with human disease. It evades the host's immune response by invading and propagating inside the vacuoles of macrophages called phagosomes. From there, it infects the Schwann cells of the peripheral nervous system, where it disrupts myelin production, thus leading to the characteristic features of leprosy, which include skin lesions and sensory loss."],
["litarch_figures/df/45/coffeebrk_NBK2345/A559.jpg",
"\nProtein coding genes distribution map for Mycobacterium leprae.\nThe leprosy bacillus genome contains numerous examples of gene deletion and decay. The relative locations of various genes in the genome are depicted in the map above. Protein coding genes are color coded in the map according to their classification within clusters of orthologous groups (COGs) functional categories. COGs represent proteins or groups of paralogs that are found in at least 3 phylogenetically-distant genomes. For more information about COGs, see Science 1997 Oct 24:278(5338):631-7.\n\n",
""],
["Protein coding genes distribution map for Mycobacterium leprae."]
]
```
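Each interleaved record mixes one-element text entries with three-element figure entries, so a consumer has to branch on entry length. A minimal sketch (the layout is inferred from the example above, not from a published schema):
```python
# Split one interleaved record (layout shown above) into text passages and
# (image path, caption) pairs. Assumption: 1-element lists are text,
# 3-element lists are [path, caption, extra] figure entries.
def split_interleaved(record):
    texts, figures = [], []
    for entry in record:
        if len(entry) == 1:
            texts.append(entry[0])
        else:
            path, caption, _extra = entry
            figures.append((path, caption))
    return texts, figures
```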
# Text
```
"Getting by with the bare minimum seems to be the modus operandi of Mycobacterium leprae \u2014 the causal agent of leprosy. Its genome sequence reveals that it has undergone massive genome 'downsizing' over time, discarding more than half its genes and rendering it the most striking example of genome reduction in a microbial pathogen.\nThe leprosy bacillus is famed for being the first microorganism definitively shown to be associated with human disease. It evades the host's immune response by invading and propagating inside the vacuoles of macrophages called phagosomes. From there, it infects the Schwann cells of the peripheral nervous system, where it disrupts myelin production, thus leading to the characteristic features of leprosy, which include skin lesions and sensory loss... "
``` |
davanstrien/test_card | 2023-10-10T17:15:54.000Z | [
"region:us"
] | davanstrien | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: lastModified
dtype: string
- name: tags
sequence: string
- name: author
dtype: string
- name: description
dtype: string
- name: citation
dtype: string
- name: cardData
dtype: 'null'
- name: likes
dtype: int64
- name: downloads
dtype: int64
- name: card
dtype: string
splits:
- name: train
num_bytes: 202491963
num_examples: 69268
download_size: 52652680
dataset_size: 202491963
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "test_card"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Diesertikel/nils | 2023-10-10T16:31:40.000Z | [
"region:us"
] | Diesertikel | null | null | null | 0 | 0 | Entry not found |
nymiz/costa-rica-autolabel | 2023-10-10T17:06:33.000Z | [
"region:us"
] | nymiz | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: doc_name
dtype: string
- name: tokens
sequence: string
- name: ner_tags
sequence:
class_label:
names:
'0': B-Currency
'1': I-Date
'2': I-Clase de asunto
'3': B-Public-organization
'4': B-Cédula-identidad
'5': I-Undefined-entities
'6': I-Public-organization
'7': I-Location
'8': B-Fax
'9': B-Clase de asunto
'10': I-Cédula-identidad
'11': O
'12': B-Location
'13': B-Nationality
'14': B-Acronym
'15': B-Person
'16': B-Undefined-entities
'17': I-Private-organization
'18': B-Date
'19': B-Phone
'20': I-Person
'21': B-Private-organization
'22': I-Currency
'23': I-Acronym
- name: iob_tags
sequence: string
splits:
- name: train
num_bytes: 9532812
num_examples: 9557
download_size: 1328744
dataset_size: 9532812
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "costa-rica-autolabel"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
deadbits/vigil-gandalf-instruction-bypass-ada-002 | 2023-10-10T19:07:03.000Z | [
"embeddings",
"text",
"security",
"region:us"
] | deadbits | null | null | null | 0 | 0 | ---
tags:
- embeddings
- text
- security
pretty_name: 'Vigil: LLM Gandalf Instruction Bypass text-embedding-ada-002'
---
# Vigil: LLM Gandalf Instruction Bypass text-embedding-ada-002
- **Repo:** [github.com/deadbits/vigil-llm](https://github.com/deadbits/vigil-llm)
`Vigil` is a Python framework and REST API for assessing Large Language Model (LLM) prompts against a set of scanners to detect prompt injections, jailbreaks, and other potentially risky inputs.
This repository contains `text-embedding-ada-002` embeddings for the [Lakera Gandalf "Ignore Instructions" dataset](https://huggingface.co/datasets/Lakera/gandalf_ignore_instructions).
All prompts from the original dataset have been lowercased before embedding.
You can use the [parquet2vdb.py](https://github.com/deadbits/prompt-injection-defense/blob/main/vigil/utils/parquet2vdb.py) utility to load the embeddings into the Vigil chromadb instance, or use them in your own application (a standalone loading sketch follows the Format section below).
## Format
```json
[
{
"text": str,
"embedding": [],
"model": "text-embedding-ada-002"
}
]
```
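For the standalone route, a minimal sketch of loading records in the format above into a local chromadb collection (the file name and collection name are illustrative assumptions, not part of this repository):
```python
import json

import chromadb

# Records in the format shown above; the path is an assumption.
with open("gandalf_ignore_instructions_ada002.json") as f:
    records = json.load(f)

client = chromadb.Client()
collection = client.create_collection(name="gandalf-instruction-bypass")

# Embeddings are precomputed, so they are passed in directly instead of
# letting chromadb compute them with an embedding function.
collection.add(
    ids=[str(i) for i in range(len(records))],
    documents=[r["text"] for r in records],
    embeddings=[r["embedding"] for r in records],
    metadatas=[{"model": r["model"]} for r in records],
)
```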
**Original dataset:** https://huggingface.co/datasets/Lakera/gandalf_ignore_instructions
```
@InProceedings{gandalf_ignore_instructions,
title = {gandalf_ignore_instructions},
author={Lakera AI (https://www.lakera.ai)},
year={2023}
}
``` |
lchakkei/OpenOrca-Traditional-Chinese-Text | 2023-10-10T16:41:01.000Z | [
"region:us"
] | lchakkei | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 6776409430
num_examples: 4233915
download_size: 3983423866
dataset_size: 6776409430
---
# Dataset Card for "OpenOrca-Traditional-Chinese-Text"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-acronym_identification-default-35e977-94268146033 | 2023-10-10T16:50:56.000Z | [
"region:us"
] | autoevaluate | null | null | null | 0 | 0 | Entry not found |
zeio/baneks-distinct | 2023-10-10T17:08:54.000Z | [
"task_categories:text-generation",
"language_creators:crowdsourced",
"size_categories:10K<n<100K",
"language:ru",
"language:en",
"license:apache-2.0",
"not-for-all-audiences",
"art",
"humour",
"jokes",
"region:us"
] | zeio | null | null | null | 0 | 0 | ---
language:
- ru
- en
license: apache-2.0
tags:
- not-for-all-audiences
- art
- humour
- jokes
annotation_creators:
- crowdsourced
language_creators:
- crowdsourced
pretty_name: baneks-distinct
size_categories:
- 10K<n<100K
task_categories:
- text-generation
---
# Dataset card for baneks-distinct
## Table of contents
- [Dataset description](#dataset-description)
- [Dataset summary](#dataset-summary)
- [Dataset structure](#dataset-structure)
- [Data instance](#data-instance)
- [Data fields](#data-fields)
## Dataset description
- **Homepage:** [baneks-distinct homepage]()
- **Repository:** [baneks-distinct repository](https://huggingface.co/datasets/zeio/baneks-distinct)
- **Point of contact:** [Zeio Nara](mailto:zeionara@gmail.com)
- **Dataset version:** `10.10.2023`
### Dataset summary
This dataset contains anecdotes parsed from a few VK social network communities. Since the dataset is regularly updated, there is no fixed number of entries, so stay tuned.
This is a version of the [base baneks dataset](https://huggingface.co/datasets/zeio/baneks) from which entries with duplicated text have been removed (a quick verification sketch appears after the data fields list below).
## Dataset structure
### Data instance
An example of an entry from the dataset is given below:
```json
{
"text": "- Папа, а кто такие алкоголики? - Ну, сынок.. Вот, видишь - четыре гендера стоят? А алкоголику кажется, что там восемь гендеров - Пап, там два гендера.",
"published": "16-09-2023 01:38",
"id": 497393,
"n-likes": 13,
"n-views": 804,
"accessed": "16-09-2023 01:51",
"source": "anekdotikategoriib"
}
```
### Data fields
Each dataset entry therefore consists of the following fields:
- `text` - text representation of the anecdote;
- `published` - publication date of the corresponding post in the format `DD-MM-YYYY hh:mm`;
- `id` - id of the corresponding post;
- `n-likes` - number of likes received by the corresponding post up to the access date;
- `n-views` - number of views received by the corresponding post up to the access date;
- `accessed` - access date of the corresponding post in the format `DD-MM-YYYY hh:mm`;
- `source` - community name in which the corresponding post has been published.
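A quick way to check the no-duplicates property is to compare the number of unique `text` values with the number of rows. A minimal sketch (the split name `train` is an assumption):
```python
from datasets import load_dataset

# Split name is an assumption; adjust to whatever this dataset exposes.
anekdoty = load_dataset("zeio/baneks-distinct", split="train")

# In the distinct variant every `text` value should appear exactly once.
assert len(set(anekdoty["text"])) == len(anekdoty)
```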
|
gaodrew/roman_empire_qa_27k | 2023-10-10T17:12:28.000Z | [
"license:mit",
"region:us"
] | gaodrew | null | null | null | 0 | 0 | ---
license: mit
---
roman_empire_qa_27k is a dataset of 27,300 prompt-completion pairs: questions and answers about the Roman Empire.
Also provided are the context snippets from which the questions and answers were generated (by GPT-3.5-turbo).
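A minimal loading sketch (the split name is an assumption; printing a record reveals the actual column names, which this card does not specify):
```python
from datasets import load_dataset

# Repo id from this card; split name is an assumption.
qa = load_dataset("gaodrew/roman_empire_qa_27k", split="train")

print(qa[0])  # inspect one prompt-completion record
```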
|
ALBADDAWI/data1 | 2023-10-10T17:21:42.000Z | [
"region:us"
] | ALBADDAWI | null | null | null | 0 | 0 | Entry not found |
lucas-meyer/asr_af | 2023-10-10T17:08:46.000Z | [
"region:us"
] | lucas-meyer | null | null | null | 0 | 0 | Entry not found |
Harshithacj123/CCU_Midterm | 2023-10-10T17:08:47.000Z | [
"region:us"
] | Harshithacj123 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 41353
num_examples: 50
download_size: 23370
dataset_size: 41353
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "CCU_Midterm"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
umarigan/turkish_wikipedia_dataset_NER | 2023-10-10T17:12:30.000Z | [
"region:us"
] | umarigan | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: title
dtype: string
- name: ner
list:
- name: end
dtype: int64
- name: entity
dtype: string
- name: index
dtype: int64
- name: score
dtype: float32
- name: start
dtype: int64
- name: word
dtype: string
- name: cleaned_ners
sequence: string
- name: cleaned_new
sequence: string
splits:
- name: train
num_bytes: 1781032869
num_examples: 265000
download_size: 698313289
dataset_size: 1781032869
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "turkish_wikipedia_dataset_NER"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Severian/Bio-Design-Process | 2023-10-10T17:12:41.000Z | [
"license:apache-2.0",
"region:us"
] | Severian | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_lizpreciatior__lzlv_70b_fp16_hf | 2023-10-10T17:26:58.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of lizpreciatior/lzlv_70b_fp16_hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lizpreciatior/lzlv_70b_fp16_hf](https://huggingface.co/lizpreciatior/lzlv_70b_fp16_hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lizpreciatior__lzlv_70b_fp16_hf\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T17:25:31.421123](https://huggingface.co/datasets/open-llm-leaderboard/details_lizpreciatior__lzlv_70b_fp16_hf/blob/main/results_2023-10-10T17-25-31.421123.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7014601877451904,\n\
\ \"acc_stderr\": 0.030810743851904177,\n \"acc_norm\": 0.7052115028528482,\n\
\ \"acc_norm_stderr\": 0.03078110062605054,\n \"mc1\": 0.4283965728274174,\n\
\ \"mc1_stderr\": 0.017323088597314757,\n \"mc2\": 0.6048969510737517,\n\
\ \"mc2_stderr\": 0.01503413923996154\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6629692832764505,\n \"acc_stderr\": 0.013813476652902279,\n\
\ \"acc_norm\": 0.7013651877133106,\n \"acc_norm_stderr\": 0.013374078615068744\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6924915355506871,\n\
\ \"acc_stderr\": 0.004605187195197424,\n \"acc_norm\": 0.8754232224656443,\n\
\ \"acc_norm_stderr\": 0.0032956349076664645\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8157894736842105,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.8157894736842105,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n\
\ \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n\
\ \"acc_stderr\": 0.032166008088022675,\n \"acc_norm\": 0.8194444444444444,\n\
\ \"acc_norm_stderr\": 0.032166008088022675\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7106382978723405,\n \"acc_stderr\": 0.02964400657700962,\n\
\ \"acc_norm\": 0.7106382978723405,\n \"acc_norm_stderr\": 0.02964400657700962\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n\
\ \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.455026455026455,\n \"acc_stderr\": 0.025646928361049398,\n \"\
acc_norm\": 0.455026455026455,\n \"acc_norm_stderr\": 0.025646928361049398\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8161290322580645,\n\
\ \"acc_stderr\": 0.022037217340267826,\n \"acc_norm\": 0.8161290322580645,\n\
\ \"acc_norm_stderr\": 0.022037217340267826\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.035107665979592154,\n\
\ \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.035107665979592154\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\"\
: 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8303030303030303,\n \"acc_stderr\": 0.029311188674983134,\n\
\ \"acc_norm\": 0.8303030303030303,\n \"acc_norm_stderr\": 0.029311188674983134\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8939393939393939,\n \"acc_stderr\": 0.021938047738853102,\n \"\
acc_norm\": 0.8939393939393939,\n \"acc_norm_stderr\": 0.021938047738853102\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.016731085293607555,\n\
\ \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.016731085293607555\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7025641025641025,\n \"acc_stderr\": 0.023177408131465942,\n\
\ \"acc_norm\": 0.7025641025641025,\n \"acc_norm_stderr\": 0.023177408131465942\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131143,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131143\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7521008403361344,\n \"acc_stderr\": 0.028047967224176892,\n\
\ \"acc_norm\": 0.7521008403361344,\n \"acc_norm_stderr\": 0.028047967224176892\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4900662251655629,\n \"acc_stderr\": 0.04081677107248436,\n \"\
acc_norm\": 0.4900662251655629,\n \"acc_norm_stderr\": 0.04081677107248436\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8899082568807339,\n \"acc_stderr\": 0.013419939018681203,\n \"\
acc_norm\": 0.8899082568807339,\n \"acc_norm_stderr\": 0.013419939018681203\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5648148148148148,\n \"acc_stderr\": 0.033812000056435254,\n \"\
acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.033812000056435254\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658928,\n \"\
acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658928\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8945147679324894,\n \"acc_stderr\": 0.01999556072375854,\n \
\ \"acc_norm\": 0.8945147679324894,\n \"acc_norm_stderr\": 0.01999556072375854\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n\
\ \"acc_stderr\": 0.02693611191280227,\n \"acc_norm\": 0.7982062780269058,\n\
\ \"acc_norm_stderr\": 0.02693611191280227\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8549618320610687,\n \"acc_stderr\": 0.030884661089515375,\n\
\ \"acc_norm\": 0.8549618320610687,\n \"acc_norm_stderr\": 0.030884661089515375\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8677685950413223,\n \"acc_stderr\": 0.03092278832044579,\n \"\
acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.03092278832044579\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.03602814176392645,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.03602814176392645\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971726,\n\
\ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971726\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5446428571428571,\n\
\ \"acc_stderr\": 0.04726835553719098,\n \"acc_norm\": 0.5446428571428571,\n\
\ \"acc_norm_stderr\": 0.04726835553719098\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9145299145299145,\n\
\ \"acc_stderr\": 0.01831589168562585,\n \"acc_norm\": 0.9145299145299145,\n\
\ \"acc_norm_stderr\": 0.01831589168562585\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8671775223499362,\n\
\ \"acc_stderr\": 0.012136303209884564,\n \"acc_norm\": 0.8671775223499362,\n\
\ \"acc_norm_stderr\": 0.012136303209884564\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7803468208092486,\n \"acc_stderr\": 0.02228963885261789,\n\
\ \"acc_norm\": 0.7803468208092486,\n \"acc_norm_stderr\": 0.02228963885261789\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5385474860335195,\n\
\ \"acc_stderr\": 0.01667273126755225,\n \"acc_norm\": 0.5385474860335195,\n\
\ \"acc_norm_stderr\": 0.01667273126755225\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.024170840879340873,\n\
\ \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.024170840879340873\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7620578778135049,\n\
\ \"acc_stderr\": 0.02418515064781871,\n \"acc_norm\": 0.7620578778135049,\n\
\ \"acc_norm_stderr\": 0.02418515064781871\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8518518518518519,\n \"acc_stderr\": 0.019766459563597256,\n\
\ \"acc_norm\": 0.8518518518518519,\n \"acc_norm_stderr\": 0.019766459563597256\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5673758865248227,\n \"acc_stderr\": 0.02955545423677884,\n \
\ \"acc_norm\": 0.5673758865248227,\n \"acc_norm_stderr\": 0.02955545423677884\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5475880052151239,\n\
\ \"acc_stderr\": 0.012712265105889138,\n \"acc_norm\": 0.5475880052151239,\n\
\ \"acc_norm_stderr\": 0.012712265105889138\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7205882352941176,\n \"acc_stderr\": 0.027257202606114948,\n\
\ \"acc_norm\": 0.7205882352941176,\n \"acc_norm_stderr\": 0.027257202606114948\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7630718954248366,\n \"acc_stderr\": 0.01720166216978978,\n \
\ \"acc_norm\": 0.7630718954248366,\n \"acc_norm_stderr\": 0.01720166216978978\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n\
\ \"acc_stderr\": 0.041723430387053825,\n \"acc_norm\": 0.7454545454545455,\n\
\ \"acc_norm_stderr\": 0.041723430387053825\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8204081632653061,\n \"acc_stderr\": 0.024573293589585637,\n\
\ \"acc_norm\": 0.8204081632653061,\n \"acc_norm_stderr\": 0.024573293589585637\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n\
\ \"acc_stderr\": 0.021166216304659393,\n \"acc_norm\": 0.900497512437811,\n\
\ \"acc_norm_stderr\": 0.021166216304659393\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.0266405825391332,\n\
\ \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.0266405825391332\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4283965728274174,\n\
\ \"mc1_stderr\": 0.017323088597314757,\n \"mc2\": 0.6048969510737517,\n\
\ \"mc2_stderr\": 0.01503413923996154\n }\n}\n```"
repo_url: https://huggingface.co/lizpreciatior/lzlv_70b_fp16_hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|arc:challenge|25_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hellaswag|10_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T17-25-31.421123.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T17-25-31.421123.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T17-25-31.421123.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T17-25-31.421123.parquet'
- config_name: results
data_files:
- split: 2023_10_10T17_25_31.421123
path:
- results_2023-10-10T17-25-31.421123.parquet
- split: latest
path:
- results_2023-10-10T17-25-31.421123.parquet
---
# Dataset Card for Evaluation run of lizpreciatior/lzlv_70b_fp16_hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lizpreciatior/lzlv_70b_fp16_hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lizpreciatior/lzlv_70b_fp16_hf](https://huggingface.co/lizpreciatior/lzlv_70b_fp16_hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
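# Load the per-sample details for one task (here TruthfulQA MC, 0-shot).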
data = load_dataset("open-llm-leaderboard/details_lizpreciatior__lzlv_70b_fp16_hf",
"harness_truthfulqa_mc_0",
split="train")
```
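The other configurations listed in this card's metadata load the same way; for instance, here is a minimal sketch for reading the aggregated scores through the `results` configuration and its `latest` split:
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated scores for the run;
# the "latest" split resolves to the most recent results parquet file.
results = load_dataset("open-llm-leaderboard/details_lizpreciatior__lzlv_70b_fp16_hf",
	"results",
	split="latest")
```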
## Latest results
These are the [latest results from run 2023-10-10T17:25:31.421123](https://huggingface.co/datasets/open-llm-leaderboard/details_lizpreciatior__lzlv_70b_fp16_hf/blob/main/results_2023-10-10T17-25-31.421123.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7014601877451904,
"acc_stderr": 0.030810743851904177,
"acc_norm": 0.7052115028528482,
"acc_norm_stderr": 0.03078110062605054,
"mc1": 0.4283965728274174,
"mc1_stderr": 0.017323088597314757,
"mc2": 0.6048969510737517,
"mc2_stderr": 0.01503413923996154
},
"harness|arc:challenge|25": {
"acc": 0.6629692832764505,
"acc_stderr": 0.013813476652902279,
"acc_norm": 0.7013651877133106,
"acc_norm_stderr": 0.013374078615068744
},
"harness|hellaswag|10": {
"acc": 0.6924915355506871,
"acc_stderr": 0.004605187195197424,
"acc_norm": 0.8754232224656443,
"acc_norm_stderr": 0.0032956349076664645
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8157894736842105,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.8157894736842105,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754407,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.032166008088022675,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.032166008088022675
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7106382978723405,
"acc_stderr": 0.02964400657700962,
"acc_norm": 0.7106382978723405,
"acc_norm_stderr": 0.02964400657700962
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.455026455026455,
"acc_stderr": 0.025646928361049398,
"acc_norm": 0.455026455026455,
"acc_norm_stderr": 0.025646928361049398
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8161290322580645,
"acc_stderr": 0.022037217340267826,
"acc_norm": 0.8161290322580645,
"acc_norm_stderr": 0.022037217340267826
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.035107665979592154,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.035107665979592154
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8303030303030303,
"acc_stderr": 0.029311188674983134,
"acc_norm": 0.8303030303030303,
"acc_norm_stderr": 0.029311188674983134
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8939393939393939,
"acc_stderr": 0.021938047738853102,
"acc_norm": 0.8939393939393939,
"acc_norm_stderr": 0.021938047738853102
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.016731085293607555,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.016731085293607555
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7025641025641025,
"acc_stderr": 0.023177408131465942,
"acc_norm": 0.7025641025641025,
"acc_norm_stderr": 0.023177408131465942
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131143,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131143
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7521008403361344,
"acc_stderr": 0.028047967224176892,
"acc_norm": 0.7521008403361344,
"acc_norm_stderr": 0.028047967224176892
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4900662251655629,
"acc_stderr": 0.04081677107248436,
"acc_norm": 0.4900662251655629,
"acc_norm_stderr": 0.04081677107248436
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8899082568807339,
"acc_stderr": 0.013419939018681203,
"acc_norm": 0.8899082568807339,
"acc_norm_stderr": 0.013419939018681203
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.033812000056435254,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.033812000056435254
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.018869514646658928,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.018869514646658928
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8945147679324894,
"acc_stderr": 0.01999556072375854,
"acc_norm": 0.8945147679324894,
"acc_norm_stderr": 0.01999556072375854
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.02693611191280227,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.02693611191280227
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8549618320610687,
"acc_stderr": 0.030884661089515375,
"acc_norm": 0.8549618320610687,
"acc_norm_stderr": 0.030884661089515375
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.03092278832044579,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.03092278832044579
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.03602814176392645,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.03602814176392645
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.030446777687971726,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.030446777687971726
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5446428571428571,
"acc_stderr": 0.04726835553719098,
"acc_norm": 0.5446428571428571,
"acc_norm_stderr": 0.04726835553719098
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9145299145299145,
"acc_stderr": 0.01831589168562585,
"acc_norm": 0.9145299145299145,
"acc_norm_stderr": 0.01831589168562585
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8671775223499362,
"acc_stderr": 0.012136303209884564,
"acc_norm": 0.8671775223499362,
"acc_norm_stderr": 0.012136303209884564
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7803468208092486,
"acc_stderr": 0.02228963885261789,
"acc_norm": 0.7803468208092486,
"acc_norm_stderr": 0.02228963885261789
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5385474860335195,
"acc_stderr": 0.01667273126755225,
"acc_norm": 0.5385474860335195,
"acc_norm_stderr": 0.01667273126755225
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.024170840879340873,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.024170840879340873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7620578778135049,
"acc_stderr": 0.02418515064781871,
"acc_norm": 0.7620578778135049,
"acc_norm_stderr": 0.02418515064781871
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.019766459563597256,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.019766459563597256
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5673758865248227,
"acc_stderr": 0.02955545423677884,
"acc_norm": 0.5673758865248227,
"acc_norm_stderr": 0.02955545423677884
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5475880052151239,
"acc_stderr": 0.012712265105889138,
"acc_norm": 0.5475880052151239,
"acc_norm_stderr": 0.012712265105889138
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7205882352941176,
"acc_stderr": 0.027257202606114948,
"acc_norm": 0.7205882352941176,
"acc_norm_stderr": 0.027257202606114948
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7630718954248366,
"acc_stderr": 0.01720166216978978,
"acc_norm": 0.7630718954248366,
"acc_norm_stderr": 0.01720166216978978
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.041723430387053825,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.041723430387053825
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8204081632653061,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.8204081632653061,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.021166216304659393,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.021166216304659393
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.0266405825391332,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.0266405825391332
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4283965728274174,
"mc1_stderr": 0.017323088597314757,
"mc2": 0.6048969510737517,
"mc2_stderr": 0.01503413923996154
}
}
```
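These metrics are also stored as a standalone JSON file in the repository (the one linked above); a minimal sketch for fetching it directly, assuming the `huggingface_hub` client is installed:
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results JSON for this run from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_lizpreciatior__lzlv_70b_fp16_hf",
    filename="results_2023-10-10T17-25-31.421123.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

# The aggregated metrics shown above live inside this dict.
print(sorted(results.keys()))
```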
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
selinerdem/german-orca | 2023-10-10T17:32:39.000Z | [
"region:us"
] | selinerdem | null | null | null | 0 | 0 | Entry not found |
asmallgreenpotato/test-unordered | 2023-10-10T17:33:27.000Z | [
"region:us"
] | asmallgreenpotato | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_Doctor-Shotgun__mythospice-limarp-70b | 2023-10-10T17:33:36.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Doctor-Shotgun/mythospice-limarp-70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Doctor-Shotgun/mythospice-limarp-70b](https://huggingface.co/Doctor-Shotgun/mythospice-limarp-70b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Doctor-Shotgun__mythospice-limarp-70b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T17:32:09.949446](https://huggingface.co/datasets/open-llm-leaderboard/details_Doctor-Shotgun__mythospice-limarp-70b/blob/main/results_2023-10-10T17-32-09.949446.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7003381131853825,\n\
\ \"acc_stderr\": 0.030868016431885005,\n \"acc_norm\": 0.7041542943988105,\n\
\ \"acc_norm_stderr\": 0.03083872039971206,\n \"mc1\": 0.39657282741738065,\n\
\ \"mc1_stderr\": 0.017124930942023518,\n \"mc2\": 0.5585587782114779,\n\
\ \"mc2_stderr\": 0.014924290521969901\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6544368600682594,\n \"acc_stderr\": 0.013896938461145677,\n\
\ \"acc_norm\": 0.6919795221843004,\n \"acc_norm_stderr\": 0.013491429517292038\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6870145389364668,\n\
\ \"acc_stderr\": 0.004627607991626915,\n \"acc_norm\": 0.8746265684126668,\n\
\ \"acc_norm_stderr\": 0.003304651037276554\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8026315789473685,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.8026315789473685,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n\
\ \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.031164899666948617,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.031164899666948617\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.047551296160629475,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.047551296160629475\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6680851063829787,\n \"acc_stderr\": 0.03078373675774565,\n\
\ \"acc_norm\": 0.6680851063829787,\n \"acc_norm_stderr\": 0.03078373675774565\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.040131241954243856,\n\
\ \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.040131241954243856\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4497354497354497,\n \"acc_stderr\": 0.025620857042936655,\n \"\
acc_norm\": 0.4497354497354497,\n \"acc_norm_stderr\": 0.025620857042936655\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8258064516129032,\n\
\ \"acc_stderr\": 0.021576248184514587,\n \"acc_norm\": 0.8258064516129032,\n\
\ \"acc_norm_stderr\": 0.021576248184514587\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\"\
: 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8242424242424242,\n \"acc_stderr\": 0.02972094300622445,\n\
\ \"acc_norm\": 0.8242424242424242,\n \"acc_norm_stderr\": 0.02972094300622445\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8686868686868687,\n \"acc_stderr\": 0.02406315641682252,\n \"\
acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.02406315641682252\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9326424870466321,\n \"acc_stderr\": 0.018088393839078912,\n\
\ \"acc_norm\": 0.9326424870466321,\n \"acc_norm_stderr\": 0.018088393839078912\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.02323458108842849,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.02323458108842849\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\
: {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n\
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.773109243697479,\n \"acc_stderr\": 0.02720537153827948,\n \
\ \"acc_norm\": 0.773109243697479,\n \"acc_norm_stderr\": 0.02720537153827948\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"\
acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8899082568807339,\n \"acc_stderr\": 0.013419939018681203,\n \"\
acc_norm\": 0.8899082568807339,\n \"acc_norm_stderr\": 0.013419939018681203\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5694444444444444,\n \"acc_stderr\": 0.03376922151252335,\n \"\
acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.03376922151252335\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073312,\n \"\
acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8860759493670886,\n \"acc_stderr\": 0.020681745135884562,\n \
\ \"acc_norm\": 0.8860759493670886,\n \"acc_norm_stderr\": 0.020681745135884562\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n\
\ \"acc_stderr\": 0.026936111912802273,\n \"acc_norm\": 0.7982062780269058,\n\
\ \"acc_norm_stderr\": 0.026936111912802273\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.028718776889342337,\n\
\ \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.028718776889342337\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8677685950413223,\n \"acc_stderr\": 0.03092278832044579,\n \"\
acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.03092278832044579\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n\
\ \"acc_stderr\": 0.035207039905179635,\n \"acc_norm\": 0.8425925925925926,\n\
\ \"acc_norm_stderr\": 0.035207039905179635\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709218,\n\
\ \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709218\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5446428571428571,\n\
\ \"acc_stderr\": 0.04726835553719098,\n \"acc_norm\": 0.5446428571428571,\n\
\ \"acc_norm_stderr\": 0.04726835553719098\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.039166677628225836,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.039166677628225836\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n\
\ \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n\
\ \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8620689655172413,\n\
\ \"acc_stderr\": 0.012331009307795656,\n \"acc_norm\": 0.8620689655172413,\n\
\ \"acc_norm_stderr\": 0.012331009307795656\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7976878612716763,\n \"acc_stderr\": 0.02162807738019612,\n\
\ \"acc_norm\": 0.7976878612716763,\n \"acc_norm_stderr\": 0.02162807738019612\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5374301675977654,\n\
\ \"acc_stderr\": 0.016675578687308085,\n \"acc_norm\": 0.5374301675977654,\n\
\ \"acc_norm_stderr\": 0.016675578687308085\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.02428861946604611,\n\
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.02428861946604611\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7620578778135049,\n\
\ \"acc_stderr\": 0.02418515064781871,\n \"acc_norm\": 0.7620578778135049,\n\
\ \"acc_norm_stderr\": 0.02418515064781871\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8425925925925926,\n \"acc_stderr\": 0.020263764996385717,\n\
\ \"acc_norm\": 0.8425925925925926,\n \"acc_norm_stderr\": 0.020263764996385717\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5638297872340425,\n \"acc_stderr\": 0.029583452036284076,\n \
\ \"acc_norm\": 0.5638297872340425,\n \"acc_norm_stderr\": 0.029583452036284076\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5508474576271186,\n\
\ \"acc_stderr\": 0.012704030518851474,\n \"acc_norm\": 0.5508474576271186,\n\
\ \"acc_norm_stderr\": 0.012704030518851474\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.02667925227010314,\n\
\ \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.02667925227010314\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7565359477124183,\n \"acc_stderr\": 0.017362473762146616,\n \
\ \"acc_norm\": 0.7565359477124183,\n \"acc_norm_stderr\": 0.017362473762146616\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7454545454545455,\n\
\ \"acc_stderr\": 0.04172343038705383,\n \"acc_norm\": 0.7454545454545455,\n\
\ \"acc_norm_stderr\": 0.04172343038705383\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8040816326530612,\n \"acc_stderr\": 0.025409301953225678,\n\
\ \"acc_norm\": 0.8040816326530612,\n \"acc_norm_stderr\": 0.025409301953225678\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n\
\ \"acc_stderr\": 0.0211662163046594,\n \"acc_norm\": 0.900497512437811,\n\
\ \"acc_norm_stderr\": 0.0211662163046594\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n\
\ \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.39657282741738065,\n\
\ \"mc1_stderr\": 0.017124930942023518,\n \"mc2\": 0.5585587782114779,\n\
\ \"mc2_stderr\": 0.014924290521969901\n }\n}\n```"
repo_url: https://huggingface.co/Doctor-Shotgun/mythospice-limarp-70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|arc:challenge|25_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hellaswag|10_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T17-32-09.949446.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T17-32-09.949446.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T17-32-09.949446.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T17-32-09.949446.parquet'
- config_name: results
data_files:
- split: 2023_10_10T17_32_09.949446
path:
- results_2023-10-10T17-32-09.949446.parquet
- split: latest
path:
- results_2023-10-10T17-32-09.949446.parquet
---
# Dataset Card for Evaluation run of Doctor-Shotgun/mythospice-limarp-70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Doctor-Shotgun/mythospice-limarp-70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Doctor-Shotgun/mythospice-limarp-70b](https://huggingface.co/Doctor-Shotgun/mythospice-limarp-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Doctor-Shotgun__mythospice-limarp-70b",
"harness_truthfulqa_mc_0",
split="train")
```
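The same pattern works for any configuration listed in the YAML header above. As a minimal sketch (using only config and split names that appear in this card), this loads the aggregated `results` config and one per-task detail config via their `latest` split:

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_Doctor-Shotgun__mythospice-limarp-70b"

# The "latest" split of each config always points to the most recent run
# (here 2023-10-10T17:32:09.949446).
aggregated = load_dataset(REPO, "results", split="latest")

# Per-task details use the config names from the YAML header, e.g. the
# 5-shot MMLU world_religions subset:
world_religions = load_dataset(
    REPO,
    "harness_hendrycksTest_world_religions_5",
    split="latest",
)
print(world_religions)
```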
## Latest results
These are the [latest results from run 2023-10-10T17:32:09.949446](https://huggingface.co/datasets/open-llm-leaderboard/details_Doctor-Shotgun__mythospice-limarp-70b/blob/main/results_2023-10-10T17-32-09.949446.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7003381131853825,
"acc_stderr": 0.030868016431885005,
"acc_norm": 0.7041542943988105,
"acc_norm_stderr": 0.03083872039971206,
"mc1": 0.39657282741738065,
"mc1_stderr": 0.017124930942023518,
"mc2": 0.5585587782114779,
"mc2_stderr": 0.014924290521969901
},
"harness|arc:challenge|25": {
"acc": 0.6544368600682594,
"acc_stderr": 0.013896938461145677,
"acc_norm": 0.6919795221843004,
"acc_norm_stderr": 0.013491429517292038
},
"harness|hellaswag|10": {
"acc": 0.6870145389364668,
"acc_stderr": 0.004627607991626915,
"acc_norm": 0.8746265684126668,
"acc_norm_stderr": 0.003304651037276554
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8026315789473685,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.8026315789473685,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.031164899666948617,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.031164899666948617
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.047551296160629475,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.047551296160629475
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6680851063829787,
"acc_stderr": 0.03078373675774565,
"acc_norm": 0.6680851063829787,
"acc_norm_stderr": 0.03078373675774565
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6344827586206897,
"acc_stderr": 0.040131241954243856,
"acc_norm": 0.6344827586206897,
"acc_norm_stderr": 0.040131241954243856
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4497354497354497,
"acc_stderr": 0.025620857042936655,
"acc_norm": 0.4497354497354497,
"acc_norm_stderr": 0.025620857042936655
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8258064516129032,
"acc_stderr": 0.021576248184514587,
"acc_norm": 0.8258064516129032,
"acc_norm_stderr": 0.021576248184514587
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.541871921182266,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.541871921182266,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8242424242424242,
"acc_stderr": 0.02972094300622445,
"acc_norm": 0.8242424242424242,
"acc_norm_stderr": 0.02972094300622445
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.02406315641682252,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.02406315641682252
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9326424870466321,
"acc_stderr": 0.018088393839078912,
"acc_norm": 0.9326424870466321,
"acc_norm_stderr": 0.018088393839078912
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7,
"acc_stderr": 0.02323458108842849,
"acc_norm": 0.7,
"acc_norm_stderr": 0.02323458108842849
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.773109243697479,
"acc_stderr": 0.02720537153827948,
"acc_norm": 0.773109243697479,
"acc_norm_stderr": 0.02720537153827948
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8899082568807339,
"acc_stderr": 0.013419939018681203,
"acc_norm": 0.8899082568807339,
"acc_norm_stderr": 0.013419939018681203
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.03376922151252335,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.03376922151252335
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9313725490196079,
"acc_stderr": 0.017744453647073312,
"acc_norm": 0.9313725490196079,
"acc_norm_stderr": 0.017744453647073312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8860759493670886,
"acc_stderr": 0.020681745135884562,
"acc_norm": 0.8860759493670886,
"acc_norm_stderr": 0.020681745135884562
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.026936111912802273,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.026936111912802273
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.028718776889342337,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.028718776889342337
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.03092278832044579,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.03092278832044579
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.035207039905179635,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.035207039905179635
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.803680981595092,
"acc_stderr": 0.031207970394709218,
"acc_norm": 0.803680981595092,
"acc_norm_stderr": 0.031207970394709218
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5446428571428571,
"acc_stderr": 0.04726835553719098,
"acc_norm": 0.5446428571428571,
"acc_norm_stderr": 0.04726835553719098
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8620689655172413,
"acc_stderr": 0.012331009307795656,
"acc_norm": 0.8620689655172413,
"acc_norm_stderr": 0.012331009307795656
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7976878612716763,
"acc_stderr": 0.02162807738019612,
"acc_norm": 0.7976878612716763,
"acc_norm_stderr": 0.02162807738019612
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5374301675977654,
"acc_stderr": 0.016675578687308085,
"acc_norm": 0.5374301675977654,
"acc_norm_stderr": 0.016675578687308085
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.02428861946604611,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.02428861946604611
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7620578778135049,
"acc_stderr": 0.02418515064781871,
"acc_norm": 0.7620578778135049,
"acc_norm_stderr": 0.02418515064781871
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.020263764996385717,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.020263764996385717
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5638297872340425,
"acc_stderr": 0.029583452036284076,
"acc_norm": 0.5638297872340425,
"acc_norm_stderr": 0.029583452036284076
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5508474576271186,
"acc_stderr": 0.012704030518851474,
"acc_norm": 0.5508474576271186,
"acc_norm_stderr": 0.012704030518851474
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7389705882352942,
"acc_stderr": 0.02667925227010314,
"acc_norm": 0.7389705882352942,
"acc_norm_stderr": 0.02667925227010314
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7565359477124183,
"acc_stderr": 0.017362473762146616,
"acc_norm": 0.7565359477124183,
"acc_norm_stderr": 0.017362473762146616
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.04172343038705383,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.04172343038705383
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8040816326530612,
"acc_stderr": 0.025409301953225678,
"acc_norm": 0.8040816326530612,
"acc_norm_stderr": 0.025409301953225678
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.0211662163046594,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.0211662163046594
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.026168221344662297,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.026168221344662297
},
"harness|truthfulqa:mc|0": {
"mc1": 0.39657282741738065,
"mc1_stderr": 0.017124930942023518,
"mc2": 0.5585587782114779,
"mc2_stderr": 0.014924290521969901
}
}
```
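To relate the per-task entries above to a single score, here is a minimal sketch, assuming the JSON object printed above has been saved locally as `results.json` (a hypothetical filename). It averages accuracy over the 57 MMLU (`hendrycksTest`) subsets; note that the top-level `"all"` block averages over every task reporting `acc` (ARC and HellaSwag included), so the two numbers will differ slightly:

```python
import json

# Load the results object shown above (hypothetical local filename).
with open("results.json") as f:
    results = json.load(f)

# Per-task MMLU keys follow the pattern "harness|hendrycksTest-<subject>|5".
mmlu_accs = [
    scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
]

mmlu_mean = sum(mmlu_accs) / len(mmlu_accs)
print(f"{len(mmlu_accs)} MMLU subsets, mean acc = {mmlu_mean:.4f}")
```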
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_Doctor-Shotgun__mythospice-70b | 2023-10-10T17:35:34.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Doctor-Shotgun/mythospice-70b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Doctor-Shotgun/mythospice-70b](https://huggingface.co/Doctor-Shotgun/mythospice-70b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Doctor-Shotgun__mythospice-70b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T17:34:08.268208](https://huggingface.co/datasets/open-llm-leaderboard/details_Doctor-Shotgun__mythospice-70b/blob/main/results_2023-10-10T17-34-08.268208.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6999701730200566,\n\
\ \"acc_stderr\": 0.030791989037421433,\n \"acc_norm\": 0.7037692455496659,\n\
\ \"acc_norm_stderr\": 0.030762798650310546,\n \"mc1\": 0.40636474908200737,\n\
\ \"mc1_stderr\": 0.017193835812093904,\n \"mc2\": 0.5675734205494971,\n\
\ \"mc2_stderr\": 0.01492879413688994\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6569965870307167,\n \"acc_stderr\": 0.013872423223718166,\n\
\ \"acc_norm\": 0.6928327645051194,\n \"acc_norm_stderr\": 0.013481034054980941\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6870145389364668,\n\
\ \"acc_stderr\": 0.004627607991626915,\n \"acc_norm\": 0.8753236407090221,\n\
\ \"acc_norm_stderr\": 0.003296764320821919\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.04094376269996793,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.04094376269996793\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8157894736842105,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.8157894736842105,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n\
\ \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.031164899666948617,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.031164899666948617\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062947,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062947\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.676595744680851,\n \"acc_stderr\": 0.030579442773610334,\n\
\ \"acc_norm\": 0.676595744680851,\n \"acc_norm_stderr\": 0.030579442773610334\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.040131241954243856,\n\
\ \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.040131241954243856\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4470899470899471,\n \"acc_stderr\": 0.025606723995777028,\n \"\
acc_norm\": 0.4470899470899471,\n \"acc_norm_stderr\": 0.025606723995777028\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8161290322580645,\n\
\ \"acc_stderr\": 0.022037217340267833,\n \"acc_norm\": 0.8161290322580645,\n\
\ \"acc_norm_stderr\": 0.022037217340267833\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\"\
: 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8242424242424242,\n \"acc_stderr\": 0.02972094300622445,\n\
\ \"acc_norm\": 0.8242424242424242,\n \"acc_norm_stderr\": 0.02972094300622445\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8686868686868687,\n \"acc_stderr\": 0.02406315641682252,\n \"\
acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.02406315641682252\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.018718998520678178,\n\
\ \"acc_norm\": 0.927461139896373,\n \"acc_norm_stderr\": 0.018718998520678178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7128205128205128,\n \"acc_stderr\": 0.02293992541853062,\n \
\ \"acc_norm\": 0.7128205128205128,\n \"acc_norm_stderr\": 0.02293992541853062\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.02822644674968352,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.02822644674968352\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.773109243697479,\n \"acc_stderr\": 0.02720537153827948,\n \
\ \"acc_norm\": 0.773109243697479,\n \"acc_norm_stderr\": 0.02720537153827948\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.46357615894039733,\n \"acc_stderr\": 0.04071636065944215,\n \"\
acc_norm\": 0.46357615894039733,\n \"acc_norm_stderr\": 0.04071636065944215\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8917431192660551,\n \"acc_stderr\": 0.01332134844761174,\n \"\
acc_norm\": 0.8917431192660551,\n \"acc_norm_stderr\": 0.01332134844761174\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5833333333333334,\n \"acc_stderr\": 0.03362277436608043,\n \"\
acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.03362277436608043\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9264705882352942,\n \"acc_stderr\": 0.018318855850089678,\n \"\
acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.018318855850089678\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8818565400843882,\n \"acc_stderr\": 0.021011052659878463,\n \
\ \"acc_norm\": 0.8818565400843882,\n \"acc_norm_stderr\": 0.021011052659878463\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8071748878923767,\n\
\ \"acc_stderr\": 0.026478240960489365,\n \"acc_norm\": 0.8071748878923767,\n\
\ \"acc_norm_stderr\": 0.026478240960489365\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.028718776889342337,\n\
\ \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.028718776889342337\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035202,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035202\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n\
\ \"acc_stderr\": 0.035207039905179635,\n \"acc_norm\": 0.8425925925925926,\n\
\ \"acc_norm_stderr\": 0.035207039905179635\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709218,\n\
\ \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709218\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5446428571428571,\n\
\ \"acc_stderr\": 0.04726835553719098,\n \"acc_norm\": 0.5446428571428571,\n\
\ \"acc_norm_stderr\": 0.04726835553719098\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.905982905982906,\n\
\ \"acc_stderr\": 0.019119892798924974,\n \"acc_norm\": 0.905982905982906,\n\
\ \"acc_norm_stderr\": 0.019119892798924974\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8633461047254151,\n\
\ \"acc_stderr\": 0.012282876868629234,\n \"acc_norm\": 0.8633461047254151,\n\
\ \"acc_norm_stderr\": 0.012282876868629234\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.791907514450867,\n \"acc_stderr\": 0.021855255263421795,\n\
\ \"acc_norm\": 0.791907514450867,\n \"acc_norm_stderr\": 0.021855255263421795\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.529608938547486,\n\
\ \"acc_stderr\": 0.016693154927383547,\n \"acc_norm\": 0.529608938547486,\n\
\ \"acc_norm_stderr\": 0.016693154927383547\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.024170840879340873,\n\
\ \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.024170840879340873\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7620578778135049,\n\
\ \"acc_stderr\": 0.024185150647818707,\n \"acc_norm\": 0.7620578778135049,\n\
\ \"acc_norm_stderr\": 0.024185150647818707\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.845679012345679,\n \"acc_stderr\": 0.020100830999850994,\n\
\ \"acc_norm\": 0.845679012345679,\n \"acc_norm_stderr\": 0.020100830999850994\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5638297872340425,\n \"acc_stderr\": 0.029583452036284076,\n \
\ \"acc_norm\": 0.5638297872340425,\n \"acc_norm_stderr\": 0.029583452036284076\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5514993481095176,\n\
\ \"acc_stderr\": 0.012702317490559825,\n \"acc_norm\": 0.5514993481095176,\n\
\ \"acc_norm_stderr\": 0.012702317490559825\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7205882352941176,\n \"acc_stderr\": 0.027257202606114948,\n\
\ \"acc_norm\": 0.7205882352941176,\n \"acc_norm_stderr\": 0.027257202606114948\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7549019607843137,\n \"acc_stderr\": 0.017401816711427657,\n \
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.017401816711427657\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7363636363636363,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.7363636363636363,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8163265306122449,\n \"acc_stderr\": 0.024789071332007636,\n\
\ \"acc_norm\": 0.8163265306122449,\n \"acc_norm_stderr\": 0.024789071332007636\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9054726368159204,\n\
\ \"acc_stderr\": 0.020687186951534094,\n \"acc_norm\": 0.9054726368159204,\n\
\ \"acc_norm_stderr\": 0.020687186951534094\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.0266405825391332,\n\
\ \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.0266405825391332\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40636474908200737,\n\
\ \"mc1_stderr\": 0.017193835812093904,\n \"mc2\": 0.5675734205494971,\n\
\ \"mc2_stderr\": 0.01492879413688994\n }\n}\n```"
repo_url: https://huggingface.co/Doctor-Shotgun/mythospice-70b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|arc:challenge|25_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hellaswag|10_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T17-34-08.268208.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T17-34-08.268208.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T17-34-08.268208.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T17-34-08.268208.parquet'
- config_name: results
data_files:
- split: 2023_10_10T17_34_08.268208
path:
- results_2023-10-10T17-34-08.268208.parquet
- split: latest
path:
- results_2023-10-10T17-34-08.268208.parquet
---
# Dataset Card for Evaluation run of Doctor-Shotgun/mythospice-70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Doctor-Shotgun/mythospice-70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Doctor-Shotgun/mythospice-70b](https://huggingface.co/Doctor-Shotgun/mythospice-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Doctor-Shotgun__mythospice-70b",
"harness_truthfulqa_mc_0",
split="train")
```
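Each configuration also exposes a `latest` split pointing at the most recent run (see the YAML frontmatter above), and the aggregated metrics live in the `results` configuration. A minimal sketch, assuming only the configs and splits listed in this card's frontmatter:
```python
from datasets import load_dataset

# "latest" always points at the most recent evaluation run.
latest_details = load_dataset(
    "open-llm-leaderboard/details_Doctor-Shotgun__mythospice-70b",
    "harness_truthfulqa_mc_0",
    split="latest",
)

# The "results" configuration holds the aggregated metrics reported below.
results = load_dataset(
    "open-llm-leaderboard/details_Doctor-Shotgun__mythospice-70b",
    "results",
    split="latest",
)
```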
## Latest results
These are the [latest results from run 2023-10-10T17:34:08.268208](https://huggingface.co/datasets/open-llm-leaderboard/details_Doctor-Shotgun__mythospice-70b/blob/main/results_2023-10-10T17-34-08.268208.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6999701730200566,
"acc_stderr": 0.030791989037421433,
"acc_norm": 0.7037692455496659,
"acc_norm_stderr": 0.030762798650310546,
"mc1": 0.40636474908200737,
"mc1_stderr": 0.017193835812093904,
"mc2": 0.5675734205494971,
"mc2_stderr": 0.01492879413688994
},
"harness|arc:challenge|25": {
"acc": 0.6569965870307167,
"acc_stderr": 0.013872423223718166,
"acc_norm": 0.6928327645051194,
"acc_norm_stderr": 0.013481034054980941
},
"harness|hellaswag|10": {
"acc": 0.6870145389364668,
"acc_stderr": 0.004627607991626915,
"acc_norm": 0.8753236407090221,
"acc_norm_stderr": 0.003296764320821919
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996793,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996793
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8157894736842105,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.8157894736842105,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.031164899666948617,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.031164899666948617
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062947,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062947
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.676595744680851,
"acc_stderr": 0.030579442773610334,
"acc_norm": 0.676595744680851,
"acc_norm_stderr": 0.030579442773610334
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6344827586206897,
"acc_stderr": 0.040131241954243856,
"acc_norm": 0.6344827586206897,
"acc_norm_stderr": 0.040131241954243856
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4470899470899471,
"acc_stderr": 0.025606723995777028,
"acc_norm": 0.4470899470899471,
"acc_norm_stderr": 0.025606723995777028
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8161290322580645,
"acc_stderr": 0.022037217340267833,
"acc_norm": 0.8161290322580645,
"acc_norm_stderr": 0.022037217340267833
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8242424242424242,
"acc_stderr": 0.02972094300622445,
"acc_norm": 0.8242424242424242,
"acc_norm_stderr": 0.02972094300622445
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.02406315641682252,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.02406315641682252
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.927461139896373,
"acc_stderr": 0.018718998520678178,
"acc_norm": 0.927461139896373,
"acc_norm_stderr": 0.018718998520678178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7128205128205128,
"acc_stderr": 0.02293992541853062,
"acc_norm": 0.7128205128205128,
"acc_norm_stderr": 0.02293992541853062
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.02822644674968352,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.02822644674968352
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.773109243697479,
"acc_stderr": 0.02720537153827948,
"acc_norm": 0.773109243697479,
"acc_norm_stderr": 0.02720537153827948
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.46357615894039733,
"acc_stderr": 0.04071636065944215,
"acc_norm": 0.46357615894039733,
"acc_norm_stderr": 0.04071636065944215
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8917431192660551,
"acc_stderr": 0.01332134844761174,
"acc_norm": 0.8917431192660551,
"acc_norm_stderr": 0.01332134844761174
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.03362277436608043,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.03362277436608043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.018318855850089678,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.018318855850089678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8818565400843882,
"acc_stderr": 0.021011052659878463,
"acc_norm": 0.8818565400843882,
"acc_norm_stderr": 0.021011052659878463
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8071748878923767,
"acc_stderr": 0.026478240960489365,
"acc_norm": 0.8071748878923767,
"acc_norm_stderr": 0.026478240960489365
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.028718776889342337,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.028718776889342337
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035202,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035202
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.035207039905179635,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.035207039905179635
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.803680981595092,
"acc_stderr": 0.031207970394709218,
"acc_norm": 0.803680981595092,
"acc_norm_stderr": 0.031207970394709218
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5446428571428571,
"acc_stderr": 0.04726835553719098,
"acc_norm": 0.5446428571428571,
"acc_norm_stderr": 0.04726835553719098
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.905982905982906,
"acc_stderr": 0.019119892798924974,
"acc_norm": 0.905982905982906,
"acc_norm_stderr": 0.019119892798924974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8633461047254151,
"acc_stderr": 0.012282876868629234,
"acc_norm": 0.8633461047254151,
"acc_norm_stderr": 0.012282876868629234
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.791907514450867,
"acc_stderr": 0.021855255263421795,
"acc_norm": 0.791907514450867,
"acc_norm_stderr": 0.021855255263421795
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.529608938547486,
"acc_stderr": 0.016693154927383547,
"acc_norm": 0.529608938547486,
"acc_norm_stderr": 0.016693154927383547
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.024170840879340873,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.024170840879340873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7620578778135049,
"acc_stderr": 0.024185150647818707,
"acc_norm": 0.7620578778135049,
"acc_norm_stderr": 0.024185150647818707
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.845679012345679,
"acc_stderr": 0.020100830999850994,
"acc_norm": 0.845679012345679,
"acc_norm_stderr": 0.020100830999850994
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5638297872340425,
"acc_stderr": 0.029583452036284076,
"acc_norm": 0.5638297872340425,
"acc_norm_stderr": 0.029583452036284076
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5514993481095176,
"acc_stderr": 0.012702317490559825,
"acc_norm": 0.5514993481095176,
"acc_norm_stderr": 0.012702317490559825
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7205882352941176,
"acc_stderr": 0.027257202606114948,
"acc_norm": 0.7205882352941176,
"acc_norm_stderr": 0.027257202606114948
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.017401816711427657,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.017401816711427657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7363636363636363,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.7363636363636363,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8163265306122449,
"acc_stderr": 0.024789071332007636,
"acc_norm": 0.8163265306122449,
"acc_norm_stderr": 0.024789071332007636
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9054726368159204,
"acc_stderr": 0.020687186951534094,
"acc_norm": 0.9054726368159204,
"acc_norm_stderr": 0.020687186951534094
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.0266405825391332,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.0266405825391332
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40636474908200737,
"mc1_stderr": 0.017193835812093904,
"mc2": 0.5675734205494971,
"mc2_stderr": 0.01492879413688994
}
}
```
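As a small worked example of consuming these numbers, the sketch below averages the per-task `acc` values of the `hendrycksTest` (MMLU) entries. It assumes the JSON above has been saved locally as `results.json` (a hypothetical filename, not part of this repository):
```python
import json

# Assumption: the JSON block above was saved locally as results.json.
with open("results.json") as f:
    results = json.load(f)

# Collect per-task accuracies for the MMLU (hendrycksTest) entries.
mmlu_accs = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
print(f"{len(mmlu_accs)} MMLU tasks, mean acc = {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```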
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Jagadeesh-ti/sql_v2 | 2023-10-10T17:50:46.000Z | [
"region:us"
] | Jagadeesh-ti | null | null | null | 0 | 0 | Entry not found |
Abira1/finance-alpaca-v2 | 2023-10-10T17:43:58.000Z | [
"region:us"
] | Abira1 | null | null | null | 0 | 0 | Entry not found |
superdinosauro/GokuClaudio | 2023-10-10T17:48:36.000Z | [
"license:openrail",
"region:us"
] | superdinosauro | null | null | null | 0 | 0 | ---
license: openrail
---
|
Satooo123/embauche | 2023-10-10T17:48:52.000Z | [
"region:us"
] | Satooo123 | null | null | null | 0 | 0 | Entry not found |
Satyanshu404/trec-cast-2019 | 2023-10-10T17:58:02.000Z | [
"license:mit",
"region:us"
] | Satyanshu404 | null | null | null | 0 | 0 | ---
license: mit
---
|
ligi2009/generate | 2023-10-10T18:22:17.000Z | [
"region:us"
] | ligi2009 | null | null | null | 0 | 0 | Entry not found |
BHARAT9983/lama_demo_data | 2023-10-10T18:28:23.000Z | [
"region:us"
] | BHARAT9983 | null | null | null | 0 | 0 | demo data |
namthai-dev/my-dataset | 2023-10-10T18:24:38.000Z | [
"region:us"
] | namthai-dev | null | null | null | 0 | 0 | Entry not found |
Mouli07/ROCO_Chest_Xray_v2 | 2023-10-10T19:24:21.000Z | [
"region:us"
] | Mouli07 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 4079190365.452
num_examples: 140612
download_size: 3847824662
dataset_size: 4079190365.452
---
# Dataset Card for "ROCO_Chest_Xray_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
emrgnt-cmplxty/rag-textbook-instruct | 2023-10-10T18:45:12.000Z | [
"region:us"
] | emrgnt-cmplxty | null | null | null | 1 | 0 | ---
dataset_info:
features:
- name: formatted_prompt
dtype: string
- name: completion
dtype: string
splits:
- name: train
num_bytes: 865048
num_examples: 32
download_size: 362078
dataset_size: 865048
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "rag-textbook-instruct"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
PTMalatji/mC4SA | 2023-10-10T18:45:39.000Z | [
"region:us"
] | PTMalatji | null | null | null | 0 | 0 | Entry not found |
TanvirOnHF/genders | 2023-10-10T18:48:40.000Z | [
"license:cdla-sharing-1.0",
"region:us"
] | TanvirOnHF | null | null | null | 0 | 0 | ---
license: cdla-sharing-1.0
---
|
emrgnt-cmplxty/rag-textbook-instruct-full | 2023-10-11T00:21:19.000Z | [
"region:us"
] | emrgnt-cmplxty | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: formatted_prompt
dtype: string
- name: completion
dtype: string
splits:
- name: train
num_bytes: 139167682
num_examples: 12896
download_size: 44135115
dataset_size: 139167682
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "rag-textbook-instruct-full"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BHARAT9983/lamav2_newdata | 2023-10-10T18:57:04.000Z | [
"region:us"
] | BHARAT9983 | null | null | null | 0 | 0 | |
TanvirOnHF/ProsePoems | 2023-10-10T19:02:16.000Z | [
"license:cdla-sharing-1.0",
"region:us"
] | TanvirOnHF | null | null | null | 0 | 0 | ---
license: cdla-sharing-1.0
---
|
babananabananana/long_lat_maps | 2023-10-10T19:31:00.000Z | [
"region:us"
] | babananabananana | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: image
dtype: image
- name: index
dtype: int64
- name: latitude
dtype: float64
- name: longitude
dtype: float64
splits:
- name: train
num_bytes: 2065210484.388
num_examples: 24702
download_size: 1978578632
dataset_size: 2065210484.388
---
# Dataset Card for "long_lat_maps"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1.2 | 2023-10-10T19:15:47.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of ICBU-NPU/FashionGPT-70B-V1.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ICBU-NPU/FashionGPT-70B-V1.2](https://huggingface.co/ICBU-NPU/FashionGPT-70B-V1.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1.2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T19:14:20.366315](https://huggingface.co/datasets/open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1.2/blob/main/results_2023-10-10T19-14-20.366315.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7009383310033525,\n\
\ \"acc_stderr\": 0.031129232412069757,\n \"acc_norm\": 0.7046677418112732,\n\
\ \"acc_norm_stderr\": 0.031098326933095735,\n \"mc1\": 0.45777233782129745,\n\
\ \"mc1_stderr\": 0.01744096571248212,\n \"mc2\": 0.6514599479049492,\n\
\ \"mc2_stderr\": 0.014944495023231023\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6979522184300341,\n \"acc_stderr\": 0.013417519144716424,\n\
\ \"acc_norm\": 0.7303754266211604,\n \"acc_norm_stderr\": 0.012968040686869159\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6938856801433977,\n\
\ \"acc_stderr\": 0.0045993589209095305,\n \"acc_norm\": 0.8814977096195977,\n\
\ \"acc_norm_stderr\": 0.003225414119289709\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7828947368421053,\n \"acc_stderr\": 0.03355045304882924,\n\
\ \"acc_norm\": 0.7828947368421053,\n \"acc_norm_stderr\": 0.03355045304882924\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724057,\n\
\ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724057\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7916666666666666,\n\
\ \"acc_stderr\": 0.03396116205845335,\n \"acc_norm\": 0.7916666666666666,\n\
\ \"acc_norm_stderr\": 0.03396116205845335\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266344,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266344\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6808510638297872,\n \"acc_stderr\": 0.030472973363380042,\n\
\ \"acc_norm\": 0.6808510638297872,\n \"acc_norm_stderr\": 0.030472973363380042\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6551724137931034,\n \"acc_stderr\": 0.03960933549451208,\n\
\ \"acc_norm\": 0.6551724137931034,\n \"acc_norm_stderr\": 0.03960933549451208\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.47354497354497355,\n \"acc_stderr\": 0.025715239811346755,\n \"\
acc_norm\": 0.47354497354497355,\n \"acc_norm_stderr\": 0.025715239811346755\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8064516129032258,\n\
\ \"acc_stderr\": 0.022475258525536057,\n \"acc_norm\": 0.8064516129032258,\n\
\ \"acc_norm_stderr\": 0.022475258525536057\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \"acc_norm\"\
: 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8363636363636363,\n \"acc_stderr\": 0.02888787239548795,\n\
\ \"acc_norm\": 0.8363636363636363,\n \"acc_norm_stderr\": 0.02888787239548795\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8838383838383839,\n \"acc_stderr\": 0.022828881775249377,\n \"\
acc_norm\": 0.8838383838383839,\n \"acc_norm_stderr\": 0.022828881775249377\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9222797927461139,\n \"acc_stderr\": 0.019321805557223157,\n\
\ \"acc_norm\": 0.9222797927461139,\n \"acc_norm_stderr\": 0.019321805557223157\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6974358974358974,\n \"acc_stderr\": 0.02329088805377272,\n \
\ \"acc_norm\": 0.6974358974358974,\n \"acc_norm_stderr\": 0.02329088805377272\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606649,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606649\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.773109243697479,\n \"acc_stderr\": 0.02720537153827947,\n \
\ \"acc_norm\": 0.773109243697479,\n \"acc_norm_stderr\": 0.02720537153827947\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.45695364238410596,\n \"acc_stderr\": 0.04067325174247443,\n \"\
acc_norm\": 0.45695364238410596,\n \"acc_norm_stderr\": 0.04067325174247443\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9027522935779817,\n \"acc_stderr\": 0.012703533408540366,\n \"\
acc_norm\": 0.9027522935779817,\n \"acc_norm_stderr\": 0.012703533408540366\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6203703703703703,\n \"acc_stderr\": 0.03309682581119035,\n \"\
acc_norm\": 0.6203703703703703,\n \"acc_norm_stderr\": 0.03309682581119035\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9166666666666666,\n \"acc_stderr\": 0.019398452135813905,\n \"\
acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.019398452135813905\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9071729957805907,\n \"acc_stderr\": 0.01888975055095671,\n \
\ \"acc_norm\": 0.9071729957805907,\n \"acc_norm_stderr\": 0.01888975055095671\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8071748878923767,\n\
\ \"acc_stderr\": 0.026478240960489365,\n \"acc_norm\": 0.8071748878923767,\n\
\ \"acc_norm_stderr\": 0.026478240960489365\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8396946564885496,\n \"acc_stderr\": 0.03217829420744633,\n\
\ \"acc_norm\": 0.8396946564885496,\n \"acc_norm_stderr\": 0.03217829420744633\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8264462809917356,\n \"acc_stderr\": 0.0345727283691767,\n \"acc_norm\"\
: 0.8264462809917356,\n \"acc_norm_stderr\": 0.0345727283691767\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8343558282208589,\n \"acc_stderr\": 0.029208296231259104,\n\
\ \"acc_norm\": 0.8343558282208589,\n \"acc_norm_stderr\": 0.029208296231259104\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5535714285714286,\n\
\ \"acc_stderr\": 0.04718471485219587,\n \"acc_norm\": 0.5535714285714286,\n\
\ \"acc_norm_stderr\": 0.04718471485219587\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.034926064766237906,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.034926064766237906\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n\
\ \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n\
\ \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8646232439335888,\n\
\ \"acc_stderr\": 0.012234384586856491,\n \"acc_norm\": 0.8646232439335888,\n\
\ \"acc_norm_stderr\": 0.012234384586856491\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8034682080924855,\n \"acc_stderr\": 0.02139396140436385,\n\
\ \"acc_norm\": 0.8034682080924855,\n \"acc_norm_stderr\": 0.02139396140436385\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5899441340782123,\n\
\ \"acc_stderr\": 0.01644970820902608,\n \"acc_norm\": 0.5899441340782123,\n\
\ \"acc_norm_stderr\": 0.01644970820902608\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7745098039215687,\n \"acc_stderr\": 0.02392915551735129,\n\
\ \"acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.02392915551735129\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7556270096463023,\n\
\ \"acc_stderr\": 0.024406162094668893,\n \"acc_norm\": 0.7556270096463023,\n\
\ \"acc_norm_stderr\": 0.024406162094668893\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8024691358024691,\n \"acc_stderr\": 0.02215288992789897,\n\
\ \"acc_norm\": 0.8024691358024691,\n \"acc_norm_stderr\": 0.02215288992789897\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5851063829787234,\n \"acc_stderr\": 0.029392236584612503,\n \
\ \"acc_norm\": 0.5851063829787234,\n \"acc_norm_stderr\": 0.029392236584612503\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5580182529335072,\n\
\ \"acc_stderr\": 0.01268397251359883,\n \"acc_norm\": 0.5580182529335072,\n\
\ \"acc_norm_stderr\": 0.01268397251359883\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7169117647058824,\n \"acc_stderr\": 0.02736586113151381,\n\
\ \"acc_norm\": 0.7169117647058824,\n \"acc_norm_stderr\": 0.02736586113151381\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7549019607843137,\n \"acc_stderr\": 0.01740181671142765,\n \
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.01740181671142765\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7959183673469388,\n \"acc_stderr\": 0.02580128347509049,\n\
\ \"acc_norm\": 0.7959183673469388,\n \"acc_norm_stderr\": 0.02580128347509049\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.025196929874827075,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.025196929874827075\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.0266405825391332,\n\
\ \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.0266405825391332\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.45777233782129745,\n\
\ \"mc1_stderr\": 0.01744096571248212,\n \"mc2\": 0.6514599479049492,\n\
\ \"mc2_stderr\": 0.014944495023231023\n }\n}\n```"
repo_url: https://huggingface.co/ICBU-NPU/FashionGPT-70B-V1.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|arc:challenge|25_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hellaswag|10_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T19-14-20.366315.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T19-14-20.366315.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T19-14-20.366315.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T19-14-20.366315.parquet'
- config_name: results
data_files:
- split: 2023_10_10T19_14_20.366315
path:
- results_2023-10-10T19-14-20.366315.parquet
- split: latest
path:
- results_2023-10-10T19-14-20.366315.parquet
---
# Dataset Card for Evaluation run of ICBU-NPU/FashionGPT-70B-V1.2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ICBU-NPU/FashionGPT-70B-V1.2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ICBU-NPU/FashionGPT-70B-V1.2](https://huggingface.co/ICBU-NPU/FashionGPT-70B-V1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1.2",
"harness_truthfulqa_mc_0",
split="train")
```
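As a further illustration (a minimal sketch using only config and split names declared in this card's YAML), you can also target a single task and the "latest" split, which always resolves to the most recent run:
```python
from datasets import load_dataset

# Details for one MMLU task; each config exposes one split per run timestamp
# plus a "latest" split that points at the newest run.
details = load_dataset("open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1.2",
	"harness_hendrycksTest_abstract_algebra_5",
	split="latest")
print(details)
```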
## Latest results
These are the [latest results from run 2023-10-10T19:14:20.366315](https://huggingface.co/datasets/open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1.2/blob/main/results_2023-10-10T19-14-20.366315.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7009383310033525,
"acc_stderr": 0.031129232412069757,
"acc_norm": 0.7046677418112732,
"acc_norm_stderr": 0.031098326933095735,
"mc1": 0.45777233782129745,
"mc1_stderr": 0.01744096571248212,
"mc2": 0.6514599479049492,
"mc2_stderr": 0.014944495023231023
},
"harness|arc:challenge|25": {
"acc": 0.6979522184300341,
"acc_stderr": 0.013417519144716424,
"acc_norm": 0.7303754266211604,
"acc_norm_stderr": 0.012968040686869159
},
"harness|hellaswag|10": {
"acc": 0.6938856801433977,
"acc_stderr": 0.0045993589209095305,
"acc_norm": 0.8814977096195977,
"acc_norm_stderr": 0.003225414119289709
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7828947368421053,
"acc_stderr": 0.03355045304882924,
"acc_norm": 0.7828947368421053,
"acc_norm_stderr": 0.03355045304882924
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724057,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7916666666666666,
"acc_stderr": 0.03396116205845335,
"acc_norm": 0.7916666666666666,
"acc_norm_stderr": 0.03396116205845335
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266344,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266344
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6808510638297872,
"acc_stderr": 0.030472973363380042,
"acc_norm": 0.6808510638297872,
"acc_norm_stderr": 0.030472973363380042
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6551724137931034,
"acc_stderr": 0.03960933549451208,
"acc_norm": 0.6551724137931034,
"acc_norm_stderr": 0.03960933549451208
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47354497354497355,
"acc_stderr": 0.025715239811346755,
"acc_norm": 0.47354497354497355,
"acc_norm_stderr": 0.025715239811346755
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8064516129032258,
"acc_stderr": 0.022475258525536057,
"acc_norm": 0.8064516129032258,
"acc_norm_stderr": 0.022475258525536057
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8363636363636363,
"acc_stderr": 0.02888787239548795,
"acc_norm": 0.8363636363636363,
"acc_norm_stderr": 0.02888787239548795
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8838383838383839,
"acc_stderr": 0.022828881775249377,
"acc_norm": 0.8838383838383839,
"acc_norm_stderr": 0.022828881775249377
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9222797927461139,
"acc_stderr": 0.019321805557223157,
"acc_norm": 0.9222797927461139,
"acc_norm_stderr": 0.019321805557223157
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6974358974358974,
"acc_stderr": 0.02329088805377272,
"acc_norm": 0.6974358974358974,
"acc_norm_stderr": 0.02329088805377272
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606649,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606649
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.773109243697479,
"acc_stderr": 0.02720537153827947,
"acc_norm": 0.773109243697479,
"acc_norm_stderr": 0.02720537153827947
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.45695364238410596,
"acc_stderr": 0.04067325174247443,
"acc_norm": 0.45695364238410596,
"acc_norm_stderr": 0.04067325174247443
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9027522935779817,
"acc_stderr": 0.012703533408540366,
"acc_norm": 0.9027522935779817,
"acc_norm_stderr": 0.012703533408540366
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.03309682581119035,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.03309682581119035
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.019398452135813905,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.019398452135813905
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9071729957805907,
"acc_stderr": 0.01888975055095671,
"acc_norm": 0.9071729957805907,
"acc_norm_stderr": 0.01888975055095671
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8071748878923767,
"acc_stderr": 0.026478240960489365,
"acc_norm": 0.8071748878923767,
"acc_norm_stderr": 0.026478240960489365
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8396946564885496,
"acc_stderr": 0.03217829420744633,
"acc_norm": 0.8396946564885496,
"acc_norm_stderr": 0.03217829420744633
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.0345727283691767,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.0345727283691767
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8343558282208589,
"acc_stderr": 0.029208296231259104,
"acc_norm": 0.8343558282208589,
"acc_norm_stderr": 0.029208296231259104
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5535714285714286,
"acc_stderr": 0.04718471485219587,
"acc_norm": 0.5535714285714286,
"acc_norm_stderr": 0.04718471485219587
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.034926064766237906,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.034926064766237906
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8646232439335888,
"acc_stderr": 0.012234384586856491,
"acc_norm": 0.8646232439335888,
"acc_norm_stderr": 0.012234384586856491
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8034682080924855,
"acc_stderr": 0.02139396140436385,
"acc_norm": 0.8034682080924855,
"acc_norm_stderr": 0.02139396140436385
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5899441340782123,
"acc_stderr": 0.01644970820902608,
"acc_norm": 0.5899441340782123,
"acc_norm_stderr": 0.01644970820902608
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.02392915551735129,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.02392915551735129
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7556270096463023,
"acc_stderr": 0.024406162094668893,
"acc_norm": 0.7556270096463023,
"acc_norm_stderr": 0.024406162094668893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8024691358024691,
"acc_stderr": 0.02215288992789897,
"acc_norm": 0.8024691358024691,
"acc_norm_stderr": 0.02215288992789897
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5851063829787234,
"acc_stderr": 0.029392236584612503,
"acc_norm": 0.5851063829787234,
"acc_norm_stderr": 0.029392236584612503
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5580182529335072,
"acc_stderr": 0.01268397251359883,
"acc_norm": 0.5580182529335072,
"acc_norm_stderr": 0.01268397251359883
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7169117647058824,
"acc_stderr": 0.02736586113151381,
"acc_norm": 0.7169117647058824,
"acc_norm_stderr": 0.02736586113151381
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.01740181671142765,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.01740181671142765
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7959183673469388,
"acc_stderr": 0.02580128347509049,
"acc_norm": 0.7959183673469388,
"acc_norm_stderr": 0.02580128347509049
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827075,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827075
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.0266405825391332,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.0266405825391332
},
"harness|truthfulqa:mc|0": {
"mc1": 0.45777233782129745,
"mc1_stderr": 0.01744096571248212,
"mc2": 0.6514599479049492,
"mc2_stderr": 0.014944495023231023
}
}
```
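These aggregated numbers can also be read programmatically from the "results" configuration (a minimal sketch; the "results" config and "latest" split are those declared in the YAML above, and the exact row schema is an assumption noted in the comments):
```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics of each run;
# assumption: each run contributes one row, with "latest" selecting the newest.
results = load_dataset("open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1.2",
	"results",
	split="latest")
print(results[0])  # inspect the stored fields; the schema may vary by harness version
```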
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
gus013666/minhavoz | 2023-10-10T20:53:20.000Z | [
"license:openrail",
"region:us"
] | gus013666 | null | null | null | 0 | 0 | ---
license: openrail
---
|
Waterfront/social-media-captions-10k | 2023-10-10T19:31:08.000Z | [
"region:us"
] | Waterfront | null | null | null | 0 | 0 | Entry not found |
Waterfront/social-media-captions-20k | 2023-10-10T19:30:08.000Z | [
"region:us"
] | Waterfront | null | null | null | 0 | 0 | Entry not found |
emrgnt-cmplxty/synth-textbook-instruct-full | 2023-10-11T00:23:21.000Z | [
"region:us"
] | emrgnt-cmplxty | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: formatted_prompt
dtype: string
- name: completion
dtype: string
splits:
- name: train
num_bytes: 30512920
num_examples: 3869
download_size: 10986151
dataset_size: 30512920
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "textbook-instruct-full"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
brunomaribeiro/ts_lyricsgenerationdataset | 2023-10-10T20:10:15.000Z | [
"region:us"
] | brunomaribeiro | null | null | null | 0 | 0 | Entry not found |
BubbleJoe/multi_nli_unified_input | 2023-10-10T19:45:11.000Z | [
"region:us"
] | BubbleJoe | null | null | null | 1 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation_matched
path: data/validation_matched-*
- split: validation_mismatched
path: data/validation_mismatched-*
dataset_info:
features:
- name: promptID
dtype: int32
- name: pairID
dtype: string
- name: premise
dtype: string
- name: premise_binary_parse
dtype: string
- name: premise_parse
dtype: string
- name: hypothesis
dtype: string
- name: hypothesis_binary_parse
dtype: string
- name: hypothesis_parse
dtype: string
- name: genre
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: input
dtype: string
splits:
- name: train
num_bytes: 487186164
num_examples: 392702
- name: validation_matched
num_bytes: 11956580
num_examples: 9815
- name: validation_mismatched
num_bytes: 12618412
num_examples: 9832
download_size: 272284496
dataset_size: 511761156
---
# Dataset Card for "multi_nli_unified_input"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_euclaise__falcon_1b_stage3_2 | 2023-10-10T19:40:49.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of euclaise/falcon_1b_stage3_2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [euclaise/falcon_1b_stage3_2](https://huggingface.co/euclaise/falcon_1b_stage3_2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_euclaise__falcon_1b_stage3_2\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T19:39:31.631601](https://huggingface.co/datasets/open-llm-leaderboard/details_euclaise__falcon_1b_stage3_2/blob/main/results_2023-10-10T19-39-31.631601.json)\
\ (note that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24357280395409467,\n\
\ \"acc_stderr\": 0.031033910699786692,\n \"acc_norm\": 0.24632860680200963,\n\
\ \"acc_norm_stderr\": 0.031038146506837786,\n \"mc1\": 0.22399020807833536,\n\
\ \"mc1_stderr\": 0.014594964329474209,\n \"mc2\": 0.3988575374907098,\n\
\ \"mc2_stderr\": 0.014942131299167513\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3174061433447099,\n \"acc_stderr\": 0.01360223908803817,\n\
\ \"acc_norm\": 0.3455631399317406,\n \"acc_norm_stderr\": 0.013896938461145685\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4493128858793069,\n\
\ \"acc_stderr\": 0.004964075870120336,\n \"acc_norm\": 0.5837482573192591,\n\
\ \"acc_norm_stderr\": 0.0049192891130275035\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.03944624162501116,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.03944624162501116\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.31,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2037735849056604,\n \"acc_stderr\": 0.024790784501775395,\n\
\ \"acc_norm\": 0.2037735849056604,\n \"acc_norm_stderr\": 0.024790784501775395\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.24,\n\
\ \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n\
\ \"acc_stderr\": 0.03186209851641143,\n \"acc_norm\": 0.2254335260115607,\n\
\ \"acc_norm_stderr\": 0.03186209851641143\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.03873958714149353,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.03873958714149353\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.225531914893617,\n \"acc_stderr\": 0.02732107841738753,\n\
\ \"acc_norm\": 0.225531914893617,\n \"acc_norm_stderr\": 0.02732107841738753\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.03999423879281337,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.03999423879281337\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.03695183311650232,\n\
\ \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.03695183311650232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.1984126984126984,\n \"acc_stderr\": 0.02053948126188688,\n \"\
acc_norm\": 0.1984126984126984,\n \"acc_norm_stderr\": 0.02053948126188688\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03670066451047181,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03670066451047181\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.2064516129032258,\n \"acc_stderr\": 0.02302589961718872,\n \"\
acc_norm\": 0.2064516129032258,\n \"acc_norm_stderr\": 0.02302589961718872\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.17733990147783252,\n \"acc_stderr\": 0.026874337276808352,\n \"\
acc_norm\": 0.17733990147783252,\n \"acc_norm_stderr\": 0.026874337276808352\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\"\
: 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2222222222222222,\n \"acc_stderr\": 0.02962022787479048,\n \"\
acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.02962022787479048\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.23834196891191708,\n \"acc_stderr\": 0.03074890536390989,\n\
\ \"acc_norm\": 0.23834196891191708,\n \"acc_norm_stderr\": 0.03074890536390989\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.23076923076923078,\n \"acc_stderr\": 0.021362027725222724,\n\
\ \"acc_norm\": 0.23076923076923078,\n \"acc_norm_stderr\": 0.021362027725222724\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2074074074074074,\n \"acc_stderr\": 0.024720713193952165,\n \
\ \"acc_norm\": 0.2074074074074074,\n \"acc_norm_stderr\": 0.024720713193952165\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.18907563025210083,\n \"acc_stderr\": 0.02543511943810535,\n\
\ \"acc_norm\": 0.18907563025210083,\n \"acc_norm_stderr\": 0.02543511943810535\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.032578473844367774,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.032578473844367774\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.21284403669724772,\n \"acc_stderr\": 0.017549376389313694,\n \"\
acc_norm\": 0.21284403669724772,\n \"acc_norm_stderr\": 0.017549376389313694\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1574074074074074,\n \"acc_stderr\": 0.024837173518242384,\n \"\
acc_norm\": 0.1574074074074074,\n \"acc_norm_stderr\": 0.024837173518242384\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.30392156862745096,\n \"acc_stderr\": 0.032282103870378935,\n \"\
acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.032282103870378935\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.26582278481012656,\n \"acc_stderr\": 0.02875679962965834,\n \
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.02875679962965834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.28699551569506726,\n\
\ \"acc_stderr\": 0.030360379710291943,\n \"acc_norm\": 0.28699551569506726,\n\
\ \"acc_norm_stderr\": 0.030360379710291943\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.26717557251908397,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.26717557251908397,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.28703703703703703,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.28703703703703703,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.26993865030674846,\n \"acc_stderr\": 0.034878251684978906,\n\
\ \"acc_norm\": 0.26993865030674846,\n \"acc_norm_stderr\": 0.034878251684978906\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1553398058252427,\n \"acc_stderr\": 0.03586594738573974,\n\
\ \"acc_norm\": 0.1553398058252427,\n \"acc_norm_stderr\": 0.03586594738573974\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2606837606837607,\n\
\ \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.2606837606837607,\n\
\ \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2707535121328225,\n\
\ \"acc_stderr\": 0.015889888362560486,\n \"acc_norm\": 0.2707535121328225,\n\
\ \"acc_norm_stderr\": 0.015889888362560486\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2832369942196532,\n \"acc_stderr\": 0.02425790170532337,\n\
\ \"acc_norm\": 0.2832369942196532,\n \"acc_norm_stderr\": 0.02425790170532337\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.21241830065359477,\n \"acc_stderr\": 0.02342037547829613,\n\
\ \"acc_norm\": 0.21241830065359477,\n \"acc_norm_stderr\": 0.02342037547829613\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2572347266881029,\n\
\ \"acc_stderr\": 0.024826171289250888,\n \"acc_norm\": 0.2572347266881029,\n\
\ \"acc_norm_stderr\": 0.024826171289250888\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.023132376234543325,\n\
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.023132376234543325\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2375886524822695,\n \"acc_stderr\": 0.025389512552729906,\n \
\ \"acc_norm\": 0.2375886524822695,\n \"acc_norm_stderr\": 0.025389512552729906\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23663624511082137,\n\
\ \"acc_stderr\": 0.01085513735157274,\n \"acc_norm\": 0.23663624511082137,\n\
\ \"acc_norm_stderr\": 0.01085513735157274\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.23161764705882354,\n \"acc_stderr\": 0.025626533803777562,\n\
\ \"acc_norm\": 0.23161764705882354,\n \"acc_norm_stderr\": 0.025626533803777562\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24509803921568626,\n \"acc_stderr\": 0.017401816711427657,\n \
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.017401816711427657\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2636363636363636,\n\
\ \"acc_stderr\": 0.04220224692971987,\n \"acc_norm\": 0.2636363636363636,\n\
\ \"acc_norm_stderr\": 0.04220224692971987\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.19183673469387755,\n \"acc_stderr\": 0.025206963154225395,\n\
\ \"acc_norm\": 0.19183673469387755,\n \"acc_norm_stderr\": 0.025206963154225395\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.26865671641791045,\n\
\ \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.26865671641791045,\n\
\ \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n\
\ \"acc_stderr\": 0.034605799075530276,\n \"acc_norm\": 0.2710843373493976,\n\
\ \"acc_norm_stderr\": 0.034605799075530276\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2982456140350877,\n \"acc_stderr\": 0.03508771929824565,\n\
\ \"acc_norm\": 0.2982456140350877,\n \"acc_norm_stderr\": 0.03508771929824565\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22399020807833536,\n\
\ \"mc1_stderr\": 0.014594964329474209,\n \"mc2\": 0.3988575374907098,\n\
\ \"mc2_stderr\": 0.014942131299167513\n }\n}\n```"
repo_url: https://huggingface.co/euclaise/falcon_1b_stage3_2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|arc:challenge|25_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hellaswag|10_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T19-39-31.631601.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T19-39-31.631601.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T19-39-31.631601.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T19-39-31.631601.parquet'
- config_name: results
data_files:
- split: 2023_10_10T19_39_31.631601
path:
- results_2023-10-10T19-39-31.631601.parquet
- split: latest
path:
- results_2023-10-10T19-39-31.631601.parquet
---
# Dataset Card for Evaluation run of euclaise/falcon_1b_stage3_2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/euclaise/falcon_1b_stage3_2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [euclaise/falcon_1b_stage3_2](https://huggingface.co/euclaise/falcon_1b_stage3_2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_euclaise__falcon_1b_stage3_2",
"harness_truthfulqa_mc_0",
split="train")
```
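You can also target the `latest` split, which always mirrors the most recent run, or the aggregated `results` configuration. A minimal sketch, using only configuration and split names that appear in the YAML above:
```python
from datasets import load_dataset

# "latest" always points at the most recent evaluation run,
# so it is usually preferable to a timestamped split.
latest_details = load_dataset(
    "open-llm-leaderboard/details_euclaise__falcon_1b_stage3_2",
    "harness_truthfulqa_mc_0",
    split="latest",
)

# The aggregated metrics for the whole run live in the "results" configuration.
results = load_dataset(
    "open-llm-leaderboard/details_euclaise__falcon_1b_stage3_2",
    "results",
    split="latest",
)
```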
## Latest results
These are the [latest results from run 2023-10-10T19:39:31.631601](https://huggingface.co/datasets/open-llm-leaderboard/details_euclaise__falcon_1b_stage3_2/blob/main/results_2023-10-10T19-39-31.631601.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24357280395409467,
"acc_stderr": 0.031033910699786692,
"acc_norm": 0.24632860680200963,
"acc_norm_stderr": 0.031038146506837786,
"mc1": 0.22399020807833536,
"mc1_stderr": 0.014594964329474209,
"mc2": 0.3988575374907098,
"mc2_stderr": 0.014942131299167513
},
"harness|arc:challenge|25": {
"acc": 0.3174061433447099,
"acc_stderr": 0.01360223908803817,
"acc_norm": 0.3455631399317406,
"acc_norm_stderr": 0.013896938461145685
},
"harness|hellaswag|10": {
"acc": 0.4493128858793069,
"acc_stderr": 0.004964075870120336,
"acc_norm": 0.5837482573192591,
"acc_norm_stderr": 0.0049192891130275035
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.03944624162501116,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.03944624162501116
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2037735849056604,
"acc_stderr": 0.024790784501775395,
"acc_norm": 0.2037735849056604,
"acc_norm_stderr": 0.024790784501775395
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.17,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.17,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.03186209851641143,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.03186209851641143
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.03873958714149353,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.03873958714149353
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.225531914893617,
"acc_stderr": 0.02732107841738753,
"acc_norm": 0.225531914893617,
"acc_norm_stderr": 0.02732107841738753
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281337,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281337
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.03695183311650232,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.03695183311650232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.1984126984126984,
"acc_stderr": 0.02053948126188688,
"acc_norm": 0.1984126984126984,
"acc_norm_stderr": 0.02053948126188688
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047181,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047181
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2064516129032258,
"acc_stderr": 0.02302589961718872,
"acc_norm": 0.2064516129032258,
"acc_norm_stderr": 0.02302589961718872
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.17733990147783252,
"acc_stderr": 0.026874337276808352,
"acc_norm": 0.17733990147783252,
"acc_norm_stderr": 0.026874337276808352
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.02962022787479048,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.02962022787479048
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.23834196891191708,
"acc_stderr": 0.03074890536390989,
"acc_norm": 0.23834196891191708,
"acc_norm_stderr": 0.03074890536390989
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23076923076923078,
"acc_stderr": 0.021362027725222724,
"acc_norm": 0.23076923076923078,
"acc_norm_stderr": 0.021362027725222724
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2074074074074074,
"acc_stderr": 0.024720713193952165,
"acc_norm": 0.2074074074074074,
"acc_norm_stderr": 0.024720713193952165
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.18907563025210083,
"acc_stderr": 0.02543511943810535,
"acc_norm": 0.18907563025210083,
"acc_norm_stderr": 0.02543511943810535
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.032578473844367774,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.032578473844367774
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21284403669724772,
"acc_stderr": 0.017549376389313694,
"acc_norm": 0.21284403669724772,
"acc_norm_stderr": 0.017549376389313694
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1574074074074074,
"acc_stderr": 0.024837173518242384,
"acc_norm": 0.1574074074074074,
"acc_norm_stderr": 0.024837173518242384
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.032282103870378935,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.032282103870378935
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.28699551569506726,
"acc_stderr": 0.030360379710291943,
"acc_norm": 0.28699551569506726,
"acc_norm_stderr": 0.030360379710291943
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.26717557251908397,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.26717557251908397,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26993865030674846,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.26993865030674846,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291519,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291519
},
"harness|hendrycksTest-management|5": {
"acc": 0.1553398058252427,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.1553398058252427,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2606837606837607,
"acc_stderr": 0.028760348956523414,
"acc_norm": 0.2606837606837607,
"acc_norm_stderr": 0.028760348956523414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2707535121328225,
"acc_stderr": 0.015889888362560486,
"acc_norm": 0.2707535121328225,
"acc_norm_stderr": 0.015889888362560486
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2832369942196532,
"acc_stderr": 0.02425790170532337,
"acc_norm": 0.2832369942196532,
"acc_norm_stderr": 0.02425790170532337
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.21241830065359477,
"acc_stderr": 0.02342037547829613,
"acc_norm": 0.21241830065359477,
"acc_norm_stderr": 0.02342037547829613
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2572347266881029,
"acc_stderr": 0.024826171289250888,
"acc_norm": 0.2572347266881029,
"acc_norm_stderr": 0.024826171289250888
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.023132376234543325,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.023132376234543325
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2375886524822695,
"acc_stderr": 0.025389512552729906,
"acc_norm": 0.2375886524822695,
"acc_norm_stderr": 0.025389512552729906
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23663624511082137,
"acc_stderr": 0.01085513735157274,
"acc_norm": 0.23663624511082137,
"acc_norm_stderr": 0.01085513735157274
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.23161764705882354,
"acc_stderr": 0.025626533803777562,
"acc_norm": 0.23161764705882354,
"acc_norm_stderr": 0.025626533803777562
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.017401816711427657,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.017401816711427657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2636363636363636,
"acc_stderr": 0.04220224692971987,
"acc_norm": 0.2636363636363636,
"acc_norm_stderr": 0.04220224692971987
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19183673469387755,
"acc_stderr": 0.025206963154225395,
"acc_norm": 0.19183673469387755,
"acc_norm_stderr": 0.025206963154225395
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.26865671641791045,
"acc_stderr": 0.03134328358208954,
"acc_norm": 0.26865671641791045,
"acc_norm_stderr": 0.03134328358208954
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.034605799075530276,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.034605799075530276
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.03508771929824565,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.03508771929824565
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22399020807833536,
"mc1_stderr": 0.014594964329474209,
"mc2": 0.3988575374907098,
"mc2_stderr": 0.014942131299167513
}
}
```
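If you prefer the raw JSON over the parquet export, the results file linked above can be fetched directly from the dataset repository. A minimal sketch (the exact top-level layout of the JSON is not documented here, so inspect its keys before relying on them):
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results JSON for this run from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_euclaise__falcon_1b_stage3_2",
    filename="results_2023-10-10T19-39-31.631601.json",
    repo_type="dataset",
)

with open(path) as f:
    run_results = json.load(f)

# Inspect the top-level keys first; the dict shown above is part of this file.
print(sorted(run_results))
```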
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
MaxReynolds/SouderRocketLauncherReference | 2023-10-10T20:20:07.000Z | [
"region:us"
] | MaxReynolds | null | null | null | 0 | 0 | Entry not found |
socjopata99/elo | 2023-10-10T20:03:43.000Z | [
"license:apache-2.0",
"region:us"
] | socjopata99 | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_llm-agents__tora-code-34b-v1.0 | 2023-10-10T20:00:14.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of llm-agents/tora-code-34b-v1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [llm-agents/tora-code-34b-v1.0](https://huggingface.co/llm-agents/tora-code-34b-v1.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_llm-agents__tora-code-34b-v1.0\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T19:58:46.874384](https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-code-34b-v1.0/blob/main/results_2023-10-10T19-58-46.874384.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.46960448692665785,\n\
\ \"acc_stderr\": 0.03519478895140827,\n \"acc_norm\": 0.4732830325230917,\n\
\ \"acc_norm_stderr\": 0.03518407016477708,\n \"mc1\": 0.26560587515299877,\n\
\ \"mc1_stderr\": 0.015461027627253595,\n \"mc2\": 0.3966399813178778,\n\
\ \"mc2_stderr\": 0.015001622827420584\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4735494880546075,\n \"acc_stderr\": 0.014590931358120174,\n\
\ \"acc_norm\": 0.5042662116040956,\n \"acc_norm_stderr\": 0.014610858923956952\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5691097390957977,\n\
\ \"acc_stderr\": 0.004941887610849033,\n \"acc_norm\": 0.7554272057359092,\n\
\ \"acc_norm_stderr\": 0.004289551633772027\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3851851851851852,\n\
\ \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.3851851851851852,\n\
\ \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.46710526315789475,\n \"acc_stderr\": 0.040601270352363966,\n\
\ \"acc_norm\": 0.46710526315789475,\n \"acc_norm_stderr\": 0.040601270352363966\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.49056603773584906,\n \"acc_stderr\": 0.0307673947078081,\n\
\ \"acc_norm\": 0.49056603773584906,\n \"acc_norm_stderr\": 0.0307673947078081\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4583333333333333,\n\
\ \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.4583333333333333,\n\
\ \"acc_norm_stderr\": 0.04166666666666665\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4161849710982659,\n\
\ \"acc_stderr\": 0.03758517775404948,\n \"acc_norm\": 0.4161849710982659,\n\
\ \"acc_norm_stderr\": 0.03758517775404948\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.046482319871173156,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.046482319871173156\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3872340425531915,\n \"acc_stderr\": 0.03184389265339525,\n\
\ \"acc_norm\": 0.3872340425531915,\n \"acc_norm_stderr\": 0.03184389265339525\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.04514496132873633,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.04514496132873633\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.04161808503501528,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.04161808503501528\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.328042328042328,\n \"acc_stderr\": 0.024180497164376896,\n \"\
acc_norm\": 0.328042328042328,\n \"acc_norm_stderr\": 0.024180497164376896\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n\
\ \"acc_stderr\": 0.04263906892795133,\n \"acc_norm\": 0.3492063492063492,\n\
\ \"acc_norm_stderr\": 0.04263906892795133\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5064516129032258,\n\
\ \"acc_stderr\": 0.02844163823354051,\n \"acc_norm\": 0.5064516129032258,\n\
\ \"acc_norm_stderr\": 0.02844163823354051\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.33497536945812806,\n \"acc_stderr\": 0.033208527423483104,\n\
\ \"acc_norm\": 0.33497536945812806,\n \"acc_norm_stderr\": 0.033208527423483104\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5757575757575758,\n \"acc_stderr\": 0.03859268142070264,\n\
\ \"acc_norm\": 0.5757575757575758,\n \"acc_norm_stderr\": 0.03859268142070264\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5707070707070707,\n \"acc_stderr\": 0.035265527246011986,\n \"\
acc_norm\": 0.5707070707070707,\n \"acc_norm_stderr\": 0.035265527246011986\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6113989637305699,\n \"acc_stderr\": 0.03517739796373132,\n\
\ \"acc_norm\": 0.6113989637305699,\n \"acc_norm_stderr\": 0.03517739796373132\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3717948717948718,\n \"acc_stderr\": 0.024503472557110932,\n\
\ \"acc_norm\": 0.3717948717948718,\n \"acc_norm_stderr\": 0.024503472557110932\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.0287420409039485,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.0287420409039485\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.42016806722689076,\n \"acc_stderr\": 0.03206183783236152,\n\
\ \"acc_norm\": 0.42016806722689076,\n \"acc_norm_stderr\": 0.03206183783236152\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6403669724770642,\n \"acc_stderr\": 0.020575234660123776,\n \"\
acc_norm\": 0.6403669724770642,\n \"acc_norm_stderr\": 0.020575234660123776\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3148148148148148,\n \"acc_stderr\": 0.03167468706828978,\n \"\
acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.03167468706828978\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6127450980392157,\n \"acc_stderr\": 0.034189312338333444,\n \"\
acc_norm\": 0.6127450980392157,\n \"acc_norm_stderr\": 0.034189312338333444\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6286919831223629,\n \"acc_stderr\": 0.03145068600744859,\n \
\ \"acc_norm\": 0.6286919831223629,\n \"acc_norm_stderr\": 0.03145068600744859\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.452914798206278,\n\
\ \"acc_stderr\": 0.03340867501923324,\n \"acc_norm\": 0.452914798206278,\n\
\ \"acc_norm_stderr\": 0.03340867501923324\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4961832061068702,\n \"acc_stderr\": 0.043851623256015534,\n\
\ \"acc_norm\": 0.4961832061068702,\n \"acc_norm_stderr\": 0.043851623256015534\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6694214876033058,\n \"acc_stderr\": 0.04294340845212093,\n \"\
acc_norm\": 0.6694214876033058,\n \"acc_norm_stderr\": 0.04294340845212093\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04803752235190192,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04803752235190192\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5828220858895705,\n \"acc_stderr\": 0.038741028598180814,\n\
\ \"acc_norm\": 0.5828220858895705,\n \"acc_norm_stderr\": 0.038741028598180814\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.04541609446503948,\n\
\ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.04541609446503948\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.02934311479809446,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.02934311479809446\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5951468710089399,\n\
\ \"acc_stderr\": 0.01755324646772026,\n \"acc_norm\": 0.5951468710089399,\n\
\ \"acc_norm_stderr\": 0.01755324646772026\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5115606936416185,\n \"acc_stderr\": 0.02691189868637792,\n\
\ \"acc_norm\": 0.5115606936416185,\n \"acc_norm_stderr\": 0.02691189868637792\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.48366013071895425,\n \"acc_stderr\": 0.028614624752805407,\n\
\ \"acc_norm\": 0.48366013071895425,\n \"acc_norm_stderr\": 0.028614624752805407\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5144694533762058,\n\
\ \"acc_stderr\": 0.02838619808417768,\n \"acc_norm\": 0.5144694533762058,\n\
\ \"acc_norm_stderr\": 0.02838619808417768\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.49074074074074076,\n \"acc_stderr\": 0.027815973433878014,\n\
\ \"acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.027815973433878014\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3546099290780142,\n \"acc_stderr\": 0.02853865002887864,\n \
\ \"acc_norm\": 0.3546099290780142,\n \"acc_norm_stderr\": 0.02853865002887864\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.34615384615384615,\n\
\ \"acc_stderr\": 0.012150699768228565,\n \"acc_norm\": 0.34615384615384615,\n\
\ \"acc_norm_stderr\": 0.012150699768228565\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.27205882352941174,\n \"acc_stderr\": 0.02703304115168146,\n\
\ \"acc_norm\": 0.27205882352941174,\n \"acc_norm_stderr\": 0.02703304115168146\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4166666666666667,\n \"acc_stderr\": 0.01994491413687358,\n \
\ \"acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.01994491413687358\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5181818181818182,\n\
\ \"acc_stderr\": 0.04785964010794916,\n \"acc_norm\": 0.5181818181818182,\n\
\ \"acc_norm_stderr\": 0.04785964010794916\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5959183673469388,\n \"acc_stderr\": 0.031414708025865885,\n\
\ \"acc_norm\": 0.5959183673469388,\n \"acc_norm_stderr\": 0.031414708025865885\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.582089552238806,\n\
\ \"acc_stderr\": 0.034875586404620636,\n \"acc_norm\": 0.582089552238806,\n\
\ \"acc_norm_stderr\": 0.034875586404620636\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079021,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079021\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6198830409356725,\n \"acc_stderr\": 0.037229657413855394,\n\
\ \"acc_norm\": 0.6198830409356725,\n \"acc_norm_stderr\": 0.037229657413855394\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26560587515299877,\n\
\ \"mc1_stderr\": 0.015461027627253595,\n \"mc2\": 0.3966399813178778,\n\
\ \"mc2_stderr\": 0.015001622827420584\n }\n}\n```"
repo_url: https://huggingface.co/llm-agents/tora-code-34b-v1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|arc:challenge|25_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hellaswag|10_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T19-58-46.874384.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T19-58-46.874384.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T19-58-46.874384.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T19-58-46.874384.parquet'
- config_name: results
data_files:
- split: 2023_10_10T19_58_46.874384
path:
- results_2023-10-10T19-58-46.874384.parquet
- split: latest
path:
- results_2023-10-10T19-58-46.874384.parquet
---
# Dataset Card for Evaluation run of llm-agents/tora-code-34b-v1.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/llm-agents/tora-code-34b-v1.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [llm-agents/tora-code-34b-v1.0](https://huggingface.co/llm-agents/tora-code-34b-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_llm-agents__tora-code-34b-v1.0",
"harness_truthfulqa_mc_0",
split="train")
```
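The same pattern works for any of the 61 configurations. Below is a minimal sketch (the config and split names mirror the `config_name` and `split` entries in the YAML header above; `get_dataset_config_names` is a standard helper from the `datasets` library):
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_llm-agents__tora-code-34b-v1.0"

# List every available configuration (one per evaluated task, plus "results").
print(get_dataset_config_names(repo))

# Load the "latest" split of a single MMLU sub-task; "latest" always points
# to the parquet file of the most recent run.
details = load_dataset(repo, "harness_hendrycksTest_abstract_algebra_5", split="latest")
```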
## Latest results
These are the [latest results from run 2023-10-10T19:58:46.874384](https://huggingface.co/datasets/open-llm-leaderboard/details_llm-agents__tora-code-34b-v1.0/blob/main/results_2023-10-10T19-58-46.874384.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.46960448692665785,
"acc_stderr": 0.03519478895140827,
"acc_norm": 0.4732830325230917,
"acc_norm_stderr": 0.03518407016477708,
"mc1": 0.26560587515299877,
"mc1_stderr": 0.015461027627253595,
"mc2": 0.3966399813178778,
"mc2_stderr": 0.015001622827420584
},
"harness|arc:challenge|25": {
"acc": 0.4735494880546075,
"acc_stderr": 0.014590931358120174,
"acc_norm": 0.5042662116040956,
"acc_norm_stderr": 0.014610858923956952
},
"harness|hellaswag|10": {
"acc": 0.5691097390957977,
"acc_stderr": 0.004941887610849033,
"acc_norm": 0.7554272057359092,
"acc_norm_stderr": 0.004289551633772027
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.46710526315789475,
"acc_stderr": 0.040601270352363966,
"acc_norm": 0.46710526315789475,
"acc_norm_stderr": 0.040601270352363966
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.49056603773584906,
"acc_stderr": 0.0307673947078081,
"acc_norm": 0.49056603773584906,
"acc_norm_stderr": 0.0307673947078081
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.04166666666666665,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.04166666666666665
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4161849710982659,
"acc_stderr": 0.03758517775404948,
"acc_norm": 0.4161849710982659,
"acc_norm_stderr": 0.03758517775404948
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.046482319871173156,
"acc_norm": 0.69,
"acc_norm_stderr": 0.046482319871173156
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3872340425531915,
"acc_stderr": 0.03184389265339525,
"acc_norm": 0.3872340425531915,
"acc_norm_stderr": 0.03184389265339525
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.04514496132873633,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.04514496132873633
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.04161808503501528,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.04161808503501528
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.328042328042328,
"acc_stderr": 0.024180497164376896,
"acc_norm": 0.328042328042328,
"acc_norm_stderr": 0.024180497164376896
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795133,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795133
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5064516129032258,
"acc_stderr": 0.02844163823354051,
"acc_norm": 0.5064516129032258,
"acc_norm_stderr": 0.02844163823354051
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33497536945812806,
"acc_stderr": 0.033208527423483104,
"acc_norm": 0.33497536945812806,
"acc_norm_stderr": 0.033208527423483104
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5757575757575758,
"acc_stderr": 0.03859268142070264,
"acc_norm": 0.5757575757575758,
"acc_norm_stderr": 0.03859268142070264
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5707070707070707,
"acc_stderr": 0.035265527246011986,
"acc_norm": 0.5707070707070707,
"acc_norm_stderr": 0.035265527246011986
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6113989637305699,
"acc_stderr": 0.03517739796373132,
"acc_norm": 0.6113989637305699,
"acc_norm_stderr": 0.03517739796373132
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3717948717948718,
"acc_stderr": 0.024503472557110932,
"acc_norm": 0.3717948717948718,
"acc_norm_stderr": 0.024503472557110932
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.0287420409039485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.0287420409039485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42016806722689076,
"acc_stderr": 0.03206183783236152,
"acc_norm": 0.42016806722689076,
"acc_norm_stderr": 0.03206183783236152
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6403669724770642,
"acc_stderr": 0.020575234660123776,
"acc_norm": 0.6403669724770642,
"acc_norm_stderr": 0.020575234660123776
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.03167468706828978,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.03167468706828978
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6127450980392157,
"acc_stderr": 0.034189312338333444,
"acc_norm": 0.6127450980392157,
"acc_norm_stderr": 0.034189312338333444
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6286919831223629,
"acc_stderr": 0.03145068600744859,
"acc_norm": 0.6286919831223629,
"acc_norm_stderr": 0.03145068600744859
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.452914798206278,
"acc_stderr": 0.03340867501923324,
"acc_norm": 0.452914798206278,
"acc_norm_stderr": 0.03340867501923324
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4961832061068702,
"acc_stderr": 0.043851623256015534,
"acc_norm": 0.4961832061068702,
"acc_norm_stderr": 0.043851623256015534
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6694214876033058,
"acc_stderr": 0.04294340845212093,
"acc_norm": 0.6694214876033058,
"acc_norm_stderr": 0.04294340845212093
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04803752235190192,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04803752235190192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5828220858895705,
"acc_stderr": 0.038741028598180814,
"acc_norm": 0.5828220858895705,
"acc_norm_stderr": 0.038741028598180814
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.04541609446503948,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.04541609446503948
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.02934311479809446,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.02934311479809446
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5951468710089399,
"acc_stderr": 0.01755324646772026,
"acc_norm": 0.5951468710089399,
"acc_norm_stderr": 0.01755324646772026
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5115606936416185,
"acc_stderr": 0.02691189868637792,
"acc_norm": 0.5115606936416185,
"acc_norm_stderr": 0.02691189868637792
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.48366013071895425,
"acc_stderr": 0.028614624752805407,
"acc_norm": 0.48366013071895425,
"acc_norm_stderr": 0.028614624752805407
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5144694533762058,
"acc_stderr": 0.02838619808417768,
"acc_norm": 0.5144694533762058,
"acc_norm_stderr": 0.02838619808417768
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.027815973433878014,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.027815973433878014
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3546099290780142,
"acc_stderr": 0.02853865002887864,
"acc_norm": 0.3546099290780142,
"acc_norm_stderr": 0.02853865002887864
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.34615384615384615,
"acc_stderr": 0.012150699768228565,
"acc_norm": 0.34615384615384615,
"acc_norm_stderr": 0.012150699768228565
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.27205882352941174,
"acc_stderr": 0.02703304115168146,
"acc_norm": 0.27205882352941174,
"acc_norm_stderr": 0.02703304115168146
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.01994491413687358,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.01994491413687358
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5181818181818182,
"acc_stderr": 0.04785964010794916,
"acc_norm": 0.5181818181818182,
"acc_norm_stderr": 0.04785964010794916
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5959183673469388,
"acc_stderr": 0.031414708025865885,
"acc_norm": 0.5959183673469388,
"acc_norm_stderr": 0.031414708025865885
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.582089552238806,
"acc_stderr": 0.034875586404620636,
"acc_norm": 0.582089552238806,
"acc_norm_stderr": 0.034875586404620636
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079021,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079021
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6198830409356725,
"acc_stderr": 0.037229657413855394,
"acc_norm": 0.6198830409356725,
"acc_norm_stderr": 0.037229657413855394
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26560587515299877,
"mc1_stderr": 0.015461027627253595,
"mc2": 0.3966399813178778,
"mc2_stderr": 0.015001622827420584
}
}
```
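The aggregated metrics above are also stored in the "results" configuration; here is a minimal sketch of retrieving them programmatically (the exact column layout of the parquet row is an assumption and may vary across harness versions):
```python
from datasets import load_dataset

# "results" stores the aggregated metrics of each run; for this dataset the
# "latest" split points to results_2023-10-10T19-58-46.874384.parquet.
results = load_dataset(
    "open-llm-leaderboard/details_llm-agents__tora-code-34b-v1.0",
    "results",
    split="latest",
)
print(results[0])  # assumed: one row per run, mirroring the JSON structure above
```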
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
si3mshady/aws_whitepapers | 2023-10-10T20:15:26.000Z | [
"region:us"
] | si3mshady | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_Delcos__Mistral-Pygmalion-7b | 2023-10-10T20:15:44.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 1 | 0 | ---
pretty_name: Evaluation run of Delcos/Mistral-Pygmalion-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Delcos/Mistral-Pygmalion-7b](https://huggingface.co/Delcos/Mistral-Pygmalion-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Delcos__Mistral-Pygmalion-7b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T20:14:17.715432](https://huggingface.co/datasets/open-llm-leaderboard/details_Delcos__Mistral-Pygmalion-7b/blob/main/results_2023-10-10T20-14-17.715432.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the \"results\" configuration and the \"latest\" split of\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4940842782473963,\n\
\ \"acc_stderr\": 0.03510437959075512,\n \"acc_norm\": 0.4981695151157412,\n\
\ \"acc_norm_stderr\": 0.03508961643892265,\n \"mc1\": 0.2778457772337821,\n\
\ \"mc1_stderr\": 0.01568092936402465,\n \"mc2\": 0.41821115723032093,\n\
\ \"mc2_stderr\": 0.013974820403469736\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5017064846416383,\n \"acc_stderr\": 0.014611305705056987,\n\
\ \"acc_norm\": 0.5443686006825939,\n \"acc_norm_stderr\": 0.014553749939306863\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5864369647480582,\n\
\ \"acc_stderr\": 0.004914655063329499,\n \"acc_norm\": 0.7848038239394542,\n\
\ \"acc_norm_stderr\": 0.00410118487096418\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.04017901275981749,\n\
\ \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.04017901275981749\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5169811320754717,\n \"acc_stderr\": 0.030755120364119905,\n\
\ \"acc_norm\": 0.5169811320754717,\n \"acc_norm_stderr\": 0.030755120364119905\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04181210050035455,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04181210050035455\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\":\
\ 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4277456647398844,\n\
\ \"acc_stderr\": 0.037724468575180255,\n \"acc_norm\": 0.4277456647398844,\n\
\ \"acc_norm_stderr\": 0.037724468575180255\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.1568627450980392,\n \"acc_stderr\": 0.036186648199362466,\n\
\ \"acc_norm\": 0.1568627450980392,\n \"acc_norm_stderr\": 0.036186648199362466\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4297872340425532,\n \"acc_stderr\": 0.03236214467715563,\n\
\ \"acc_norm\": 0.4297872340425532,\n \"acc_norm_stderr\": 0.03236214467715563\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30952380952380953,\n \"acc_stderr\": 0.023809523809523853,\n \"\
acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.023809523809523853\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.040735243221471255,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.040735243221471255\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5290322580645161,\n\
\ \"acc_stderr\": 0.028396016402761,\n \"acc_norm\": 0.5290322580645161,\n\
\ \"acc_norm_stderr\": 0.028396016402761\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3694581280788177,\n \"acc_stderr\": 0.033959703819985726,\n\
\ \"acc_norm\": 0.3694581280788177,\n \"acc_norm_stderr\": 0.033959703819985726\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.037937131711656344,\n\
\ \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.037937131711656344\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5959595959595959,\n \"acc_stderr\": 0.03496130972056128,\n \"\
acc_norm\": 0.5959595959595959,\n \"acc_norm_stderr\": 0.03496130972056128\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7305699481865285,\n \"acc_stderr\": 0.03201867122877794,\n\
\ \"acc_norm\": 0.7305699481865285,\n \"acc_norm_stderr\": 0.03201867122877794\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.02529460802398647,\n \
\ \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.02529460802398647\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.0322529423239964,\n \
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.0322529423239964\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6605504587155964,\n\
\ \"acc_stderr\": 0.02030210934266235,\n \"acc_norm\": 0.6605504587155964,\n\
\ \"acc_norm_stderr\": 0.02030210934266235\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.3611111111111111,\n \"acc_stderr\": 0.03275773486100999,\n\
\ \"acc_norm\": 0.3611111111111111,\n \"acc_norm_stderr\": 0.03275773486100999\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6715686274509803,\n \"acc_stderr\": 0.032962451101722294,\n \"\
acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.032962451101722294\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7088607594936709,\n \"acc_stderr\": 0.029571601065753374,\n \
\ \"acc_norm\": 0.7088607594936709,\n \"acc_norm_stderr\": 0.029571601065753374\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5739910313901345,\n\
\ \"acc_stderr\": 0.0331883328621728,\n \"acc_norm\": 0.5739910313901345,\n\
\ \"acc_norm_stderr\": 0.0331883328621728\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.04317171194870255,\n\
\ \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.04317171194870255\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6198347107438017,\n \"acc_stderr\": 0.04431324501968431,\n \"\
acc_norm\": 0.6198347107438017,\n \"acc_norm_stderr\": 0.04431324501968431\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04803752235190193,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04803752235190193\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.558282208588957,\n \"acc_stderr\": 0.03901591825836184,\n\
\ \"acc_norm\": 0.558282208588957,\n \"acc_norm_stderr\": 0.03901591825836184\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6019417475728155,\n \"acc_stderr\": 0.048467482539772386,\n\
\ \"acc_norm\": 0.6019417475728155,\n \"acc_norm_stderr\": 0.048467482539772386\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7136752136752137,\n\
\ \"acc_stderr\": 0.02961432369045665,\n \"acc_norm\": 0.7136752136752137,\n\
\ \"acc_norm_stderr\": 0.02961432369045665\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6819923371647509,\n\
\ \"acc_stderr\": 0.01665348627561539,\n \"acc_norm\": 0.6819923371647509,\n\
\ \"acc_norm_stderr\": 0.01665348627561539\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5346820809248555,\n \"acc_stderr\": 0.02685425792825888,\n\
\ \"acc_norm\": 0.5346820809248555,\n \"acc_norm_stderr\": 0.02685425792825888\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2435754189944134,\n\
\ \"acc_stderr\": 0.014355911964767867,\n \"acc_norm\": 0.2435754189944134,\n\
\ \"acc_norm_stderr\": 0.014355911964767867\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.028580341065138296,\n\
\ \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.028580341065138296\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6237942122186495,\n\
\ \"acc_stderr\": 0.027513925683549434,\n \"acc_norm\": 0.6237942122186495,\n\
\ \"acc_norm_stderr\": 0.027513925683549434\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5246913580246914,\n \"acc_stderr\": 0.02778680093142745,\n\
\ \"acc_norm\": 0.5246913580246914,\n \"acc_norm_stderr\": 0.02778680093142745\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3829787234042553,\n \"acc_stderr\": 0.02899908090480618,\n \
\ \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.02899908090480618\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.36962190352020863,\n\
\ \"acc_stderr\": 0.012328445778575253,\n \"acc_norm\": 0.36962190352020863,\n\
\ \"acc_norm_stderr\": 0.012328445778575253\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5441176470588235,\n \"acc_stderr\": 0.030254372573976715,\n\
\ \"acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.030254372573976715\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4820261437908497,\n \"acc_stderr\": 0.020214761037872408,\n \
\ \"acc_norm\": 0.4820261437908497,\n \"acc_norm_stderr\": 0.020214761037872408\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n\
\ \"acc_stderr\": 0.04724577405731572,\n \"acc_norm\": 0.5818181818181818,\n\
\ \"acc_norm_stderr\": 0.04724577405731572\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5510204081632653,\n \"acc_stderr\": 0.03184213866687579,\n\
\ \"acc_norm\": 0.5510204081632653,\n \"acc_norm_stderr\": 0.03184213866687579\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.681592039800995,\n\
\ \"acc_stderr\": 0.03294118479054095,\n \"acc_norm\": 0.681592039800995,\n\
\ \"acc_norm_stderr\": 0.03294118479054095\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39759036144578314,\n\
\ \"acc_stderr\": 0.038099730845402184,\n \"acc_norm\": 0.39759036144578314,\n\
\ \"acc_norm_stderr\": 0.038099730845402184\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7076023391812866,\n \"acc_stderr\": 0.03488647713457922,\n\
\ \"acc_norm\": 0.7076023391812866,\n \"acc_norm_stderr\": 0.03488647713457922\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2778457772337821,\n\
\ \"mc1_stderr\": 0.01568092936402465,\n \"mc2\": 0.41821115723032093,\n\
\ \"mc2_stderr\": 0.013974820403469736\n }\n}\n```"
repo_url: https://huggingface.co/Delcos/Mistral-Pygmalion-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|arc:challenge|25_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hellaswag|10_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T20-14-17.715432.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T20-14-17.715432.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T20-14-17.715432.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T20-14-17.715432.parquet'
- config_name: results
data_files:
- split: 2023_10_10T20_14_17.715432
path:
- results_2023-10-10T20-14-17.715432.parquet
- split: latest
path:
- results_2023-10-10T20-14-17.715432.parquet
---
# Dataset Card for Evaluation run of Delcos/Mistral-Pygmalion-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Delcos/Mistral-Pygmalion-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Delcos/Mistral-Pygmalion-7b](https://huggingface.co/Delcos/Mistral-Pygmalion-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Delcos__Mistral-Pygmalion-7b",
"harness_truthfulqa_mc_0",
split="train")
```
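For example, to read the aggregated metrics or the detail files for a single task, a minimal sketch (the config names below come from the `configs` list above, and `latest` is the split that always points at the most recent run):
```python
from datasets import load_dataset

# Aggregated metrics for the run (the "results" configuration).
results = load_dataset("open-llm-leaderboard/details_Delcos__Mistral-Pygmalion-7b",
                       "results",
                       split="latest")

# Detail files for a single task, e.g. 5-shot abstract algebra.
details = load_dataset("open-llm-leaderboard/details_Delcos__Mistral-Pygmalion-7b",
                       "harness_hendrycksTest_abstract_algebra_5",
                       split="latest")
```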
## Latest results
These are the [latest results from run 2023-10-10T20:14:17.715432](https://huggingface.co/datasets/open-llm-leaderboard/details_Delcos__Mistral-Pygmalion-7b/blob/main/results_2023-10-10T20-14-17.715432.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4940842782473963,
"acc_stderr": 0.03510437959075512,
"acc_norm": 0.4981695151157412,
"acc_norm_stderr": 0.03508961643892265,
"mc1": 0.2778457772337821,
"mc1_stderr": 0.01568092936402465,
"mc2": 0.41821115723032093,
"mc2_stderr": 0.013974820403469736
},
"harness|arc:challenge|25": {
"acc": 0.5017064846416383,
"acc_stderr": 0.014611305705056987,
"acc_norm": 0.5443686006825939,
"acc_norm_stderr": 0.014553749939306863
},
"harness|hellaswag|10": {
"acc": 0.5864369647480582,
"acc_stderr": 0.004914655063329499,
"acc_norm": 0.7848038239394542,
"acc_norm_stderr": 0.00410118487096418
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.04017901275981749,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.04017901275981749
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5169811320754717,
"acc_stderr": 0.030755120364119905,
"acc_norm": 0.5169811320754717,
"acc_norm_stderr": 0.030755120364119905
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5,
"acc_stderr": 0.04181210050035455,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04181210050035455
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4277456647398844,
"acc_stderr": 0.037724468575180255,
"acc_norm": 0.4277456647398844,
"acc_norm_stderr": 0.037724468575180255
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.1568627450980392,
"acc_stderr": 0.036186648199362466,
"acc_norm": 0.1568627450980392,
"acc_norm_stderr": 0.036186648199362466
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4297872340425532,
"acc_stderr": 0.03236214467715563,
"acc_norm": 0.4297872340425532,
"acc_norm_stderr": 0.03236214467715563
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322004,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322004
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.023809523809523853,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.023809523809523853
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.040735243221471255,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.040735243221471255
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5290322580645161,
"acc_stderr": 0.028396016402761,
"acc_norm": 0.5290322580645161,
"acc_norm_stderr": 0.028396016402761
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3694581280788177,
"acc_stderr": 0.033959703819985726,
"acc_norm": 0.3694581280788177,
"acc_norm_stderr": 0.033959703819985726
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.037937131711656344,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.037937131711656344
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5959595959595959,
"acc_stderr": 0.03496130972056128,
"acc_norm": 0.5959595959595959,
"acc_norm_stderr": 0.03496130972056128
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7305699481865285,
"acc_stderr": 0.03201867122877794,
"acc_norm": 0.7305699481865285,
"acc_norm_stderr": 0.03201867122877794
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.02529460802398647,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.02529460802398647
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.0322529423239964,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.0322529423239964
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943342,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6605504587155964,
"acc_stderr": 0.02030210934266235,
"acc_norm": 0.6605504587155964,
"acc_norm_stderr": 0.02030210934266235
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.03275773486100999,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.03275773486100999
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.032962451101722294,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.032962451101722294
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7088607594936709,
"acc_stderr": 0.029571601065753374,
"acc_norm": 0.7088607594936709,
"acc_norm_stderr": 0.029571601065753374
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5739910313901345,
"acc_stderr": 0.0331883328621728,
"acc_norm": 0.5739910313901345,
"acc_norm_stderr": 0.0331883328621728
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.04317171194870255,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.04317171194870255
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6198347107438017,
"acc_stderr": 0.04431324501968431,
"acc_norm": 0.6198347107438017,
"acc_norm_stderr": 0.04431324501968431
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04803752235190193,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04803752235190193
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.558282208588957,
"acc_stderr": 0.03901591825836184,
"acc_norm": 0.558282208588957,
"acc_norm_stderr": 0.03901591825836184
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.6019417475728155,
"acc_stderr": 0.048467482539772386,
"acc_norm": 0.6019417475728155,
"acc_norm_stderr": 0.048467482539772386
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7136752136752137,
"acc_stderr": 0.02961432369045665,
"acc_norm": 0.7136752136752137,
"acc_norm_stderr": 0.02961432369045665
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6819923371647509,
"acc_stderr": 0.01665348627561539,
"acc_norm": 0.6819923371647509,
"acc_norm_stderr": 0.01665348627561539
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5346820809248555,
"acc_stderr": 0.02685425792825888,
"acc_norm": 0.5346820809248555,
"acc_norm_stderr": 0.02685425792825888
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2435754189944134,
"acc_stderr": 0.014355911964767867,
"acc_norm": 0.2435754189944134,
"acc_norm_stderr": 0.014355911964767867
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.028580341065138296,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.028580341065138296
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6237942122186495,
"acc_stderr": 0.027513925683549434,
"acc_norm": 0.6237942122186495,
"acc_norm_stderr": 0.027513925683549434
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5246913580246914,
"acc_stderr": 0.02778680093142745,
"acc_norm": 0.5246913580246914,
"acc_norm_stderr": 0.02778680093142745
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.02899908090480618,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.02899908090480618
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.36962190352020863,
"acc_stderr": 0.012328445778575253,
"acc_norm": 0.36962190352020863,
"acc_norm_stderr": 0.012328445778575253
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5441176470588235,
"acc_stderr": 0.030254372573976715,
"acc_norm": 0.5441176470588235,
"acc_norm_stderr": 0.030254372573976715
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4820261437908497,
"acc_stderr": 0.020214761037872408,
"acc_norm": 0.4820261437908497,
"acc_norm_stderr": 0.020214761037872408
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731572,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731572
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5510204081632653,
"acc_stderr": 0.03184213866687579,
"acc_norm": 0.5510204081632653,
"acc_norm_stderr": 0.03184213866687579
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.681592039800995,
"acc_stderr": 0.03294118479054095,
"acc_norm": 0.681592039800995,
"acc_norm_stderr": 0.03294118479054095
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39759036144578314,
"acc_stderr": 0.038099730845402184,
"acc_norm": 0.39759036144578314,
"acc_norm_stderr": 0.038099730845402184
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7076023391812866,
"acc_stderr": 0.03488647713457922,
"acc_norm": 0.7076023391812866,
"acc_norm_stderr": 0.03488647713457922
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2778457772337821,
"mc1_stderr": 0.01568092936402465,
"mc2": 0.41821115723032093,
"mc2_stderr": 0.013974820403469736
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
HuggingSander/todo | 2023-10-10T20:21:23.000Z | [
"region:us"
] | HuggingSander | null | null | null | 0 | 0 | Entry not found |
johnymoreira/squad-ptbr | 2023-10-10T20:21:38.000Z | [
"region:us"
] | johnymoreira | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
splits:
- name: train
num_bytes: 177842701
num_examples: 174420
- name: dev
num_bytes: 58706149
num_examples: 55028
download_size: 42266665
dataset_size: 236548850
---
# Dataset Card for "squad-ptbr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Ka4on/ultrasound | 2023-10-10T20:39:25.000Z | [
"region:us"
] | Ka4on | null | null | null | 0 | 0 | This is a dataset for fine-tuning LLMs on radiology reports (ultrasound). |
PeacefulData/HyPoradise-v1-GigaSpeech-Entertainment | 2023-10-10T22:28:24.000Z | [
"task_categories:text-generation",
"language_creators:expert-generated",
"size_categories:1k<n<10M",
"license:mit",
"code",
"Whisper-tiny",
"region:us"
] | PeacefulData | null | null | null | 1 | 0 | ---
license: mit
language_creators:
- expert-generated
task_categories:
- text-generation
tags:
- code
- Whisper-tiny
pretty_name: Whispering LLaLMA for new Hypotheses Paradise Subset
size_categories:
- 1k<n<10M
---
|
xFrisky02/artur | 2023-10-10T21:15:19.000Z | [
"region:us"
] | xFrisky02 | null | null | null | 0 | 0 | Entry not found |
codecomplete/base_dataset | 2023-10-10T20:53:14.000Z | [
"region:us"
] | codecomplete | null | null | null | 0 | 0 | Entry not found |
sordonia/my-wiki-latex_mmlu_from_valid_all | 2023-10-11T01:19:27.000Z | [
"region:us"
] | sordonia | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: subject
dtype: string
- name: docno
dtype: int64
- name: score
dtype: float64
- name: dfq
dtype: int64
- name: text
dtype: string
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: revid
dtype: string
splits:
- name: train
num_bytes: 1139620543
num_examples: 137881
download_size: 0
dataset_size: 1139620543
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "my-wiki-latex_mmlu_from_valid_all"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
stealthwriter/humanAI5600Convert | 2023-10-10T20:59:00.000Z | [
"region:us"
] | stealthwriter | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1303230
num_examples: 4849
- name: validation
num_bytes: 143929
num_examples: 539
download_size: 560202
dataset_size: 1447159
---
# Dataset Card for "humanAI5600Convert"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BubbleJoe/snli_unified_input | 2023-10-11T00:06:53.000Z | [
"region:us"
] | BubbleJoe | null | null | null | 1 | 0 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: input
dtype: string
splits:
- name: test
num_bytes: 2617808
num_examples: 10000
- name: train
num_bytes: 137270292
num_examples: 550152
- name: validation
num_bytes: 2626072
num_examples: 10000
download_size: 40318460
dataset_size: 142514172
---
# Dataset Card for "snli_unified_input"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
itsjacksimon/printersoon | 2023-10-10T21:16:03.000Z | [
"task_categories:image-to-text",
"region:us"
] | itsjacksimon | null | null | null | 0 | 0 | ---
task_categories:
- image-to-text
--- |
nadsoft/ara-sample | 2023-10-10T21:21:30.000Z | [
"region:us"
] | nadsoft | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 16610581.0
num_examples: 108
download_size: 15605783
dataset_size: 16610581.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# test |
Celiroad/celiroad | 2023-10-10T21:17:34.000Z | [
"task_categories:text-classification",
"size_categories:n<1K",
"language:en",
"license:afl-3.0",
"diego lincoln",
"celi road",
"celi",
"region:us"
] | Celiroad | null | null | null | 0 | 0 | ---
license: afl-3.0
task_categories:
- text-classification
language:
- en
tags:
- diego lincoln
- celi road
- celi
size_categories:
- n<1K
--- |
ostapeno/old_wiki_SUB_10_sps01_generated_platypus_icl5_answers_iter1 | 2023-10-10T21:21:19.000Z | [
"region:us"
] | ostapeno | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: subject
dtype: string
- name: response
dtype: string
- name: author_instr
dtype: string
- name: inst_index_for_context
dtype: 'null'
- name: author_response
dtype: string
- name: normalized_cumul_logprob_response
dtype: float64
splits:
- name: formal_logic
num_bytes: 911674.0191603875
num_examples: 249
- name: machine_learning
num_bytes: 1380325.7237890204
num_examples: 377
- name: global_facts
num_bytes: 1501149.9913885898
num_examples: 410
- name: abstract_algebra
num_bytes: 838447.1903121637
num_examples: 229
- name: high_school_physics
num_bytes: 2237079.62131324
num_examples: 611
- name: college_biology
num_bytes: 1592683.5274488698
num_examples: 435
- name: high_school_government_and_politics
num_bytes: 1794057.3067814854
num_examples: 490
- name: prehistory
num_bytes: 2500696.205166846
num_examples: 683
- name: security_studies
num_bytes: 2295661.0843918193
num_examples: 627
- name: sociology
num_bytes: 1955156.330247578
num_examples: 534
download_size: 8911647
dataset_size: 17006931.0
---
# Dataset Card for "old_wiki_SUB_10_sps01_generated_platypus_icl5_answers_iter1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cannlytics/cannabis_analytes | 2023-10-10T23:20:30.000Z | [
"license:cc-by-4.0",
"region:us"
] | cannlytics | This dataset consists of analyte data for various analytes that are regularly tested for in cannabis. The dataset consists of sub-datasets for each type of test, as well as a sub-dataset that includes all analytes. | @inproceedings{cannlytics2023cannabis_analytes,
author = {Skeate, Keegan and O'Sullivan-Sutherland, Candace},
title = {Cannabis Analytes},
booktitle = {Cannabis Data Science},
month = {October},
year = {2023},
address = {United States of America},
publisher = {Cannlytics}
} | null | 1 | 0 | ---
pretty_name: cannabis_analytes
license:
- cc-by-4.0
---
# Cannabis Analytes
This dataset consists of data on the various analytes that are regularly tested for in cannabis, with a sub-dataset for each type of test as well as a sub-dataset that includes all analytes.
## Dataset Structure
The dataset is partitioned into eight subsets: one for each type of test, plus an aggregate subset that includes all analytes.

| Subset | Code | Status |
|--------|------|--------|
| [All](https://huggingface.co/datasets/cannlytics/cannabis_licenses/tree/main/data/analytes.json) | `all` | ✅ |
| [Cannabinoids](https://huggingface.co/datasets/cannlytics/cannabis_licenses/tree/main/data/cannabinoids.json) | `cannabinoids` | ✅ |
| [Terpenes](https://huggingface.co/datasets/cannlytics/cannabis_licenses/tree/main/data/terpenes.json) | `terpenes` | ✅ |
| Pesticides | `pesticides` | ⏳ Coming soon |
| Microbes | `microbes` | ⏳ Coming soon |
| Heavy metals | `heavy_metals` | ⏳ Coming soon |
| Residual solvents | `residual_solvents` | ⏳ Coming soon |
| Other | `other` | ⏳ Coming soon |
## Using the Dataset
You can load all the analytes, or the analytes for a specific test. For example:
```py
from datasets import load_dataset

# Get all of the analytes.
dataset = load_dataset('cannlytics/cannabis_analytes', 'all')
analytes = dataset['data']

# Get the cannabinoids.
dataset = load_dataset('cannlytics/cannabis_analytes', 'cannabinoids')
cannabinoids = dataset['data']

# Get the terpenes.
dataset = load_dataset('cannlytics/cannabis_analytes', 'terpenes')
terpenes = dataset['data']
```
## Data Fields
Below is a non-exhaustive list of fields used to standardize the various data that are encountered. You may expect to find the following for each observation:
| Field | Example | Description |
|------------------------------|----------------------------------------------|------------------------------------------------------------------------------------------------------|
| `key` | `"thca"` | A unique ID for each analyte. |
| `description` | `"Δ-9-Tetrahydrocannabinol is a cannabinoid..."` | A brief description or summary about the analyte. |
| `name` | `"THC"` | Common name of the analyte. |
| `scientific_name` | `"\u0394-9-Tetrahydrocannabinol"` | The scientific name or IUPAC name of the analyte. |
| `type` | `"cannabinoid"` | The type or classification of the analyte (e.g., terpene, cannabinoid). |
| `wikipedia_url` | `"https://en.wikipedia.org/wiki/Tetrahydrocannabinol"` | The Wikipedia URL where more detailed information can be found about the analyte. |
| `degrades_to` | `["cannabinol"]` | A list of chemicals or substances the analyte degrades to. |
| `precursors` | `["thca"]` | A list of precursor chemicals or substances related to the analyte. |
| `subtype` | `"psychoactive"` | A sub-classification or additional details about the type of the analyte. |
| `cas_number` | `"1972-08-3"` | The Chemical Abstracts Service (CAS) registry number, which is a unique identifier for chemical substances.|
| `chemical_formula` | `"C21H30O2"` | The chemical formula of the analyte. |
| `molar_mass` | `"314.5 g/mol"` | The molar mass of the analyte. |
| `density` | `"1.0±0.1 g/cm3"` | The density of the analyte. |
| `boiling_point` | `"383.5±42.0 °C"` | The boiling point of the analyte. |
| `image_url` | `"https://example.com/image.jpg"` | URL of an image representing the analyte. |
| `chemical_formula_image_url` | `"https://example.com/formula_image.jpg"` | URL of an image representing the chemical formula of the analyte. |
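As a rough sketch of how these fields might be used in practice (assuming the `all` subset loads as in the example above; the lookup below is illustrative and not part of the dataset itself):
```py
from datasets import load_dataset

# Load every analyte and index the observations by their unique `key` field.
dataset = load_dataset('cannlytics/cannabis_analytes', 'all')
analytes = {record['key']: record for record in dataset['data']}

# Read a few of the documented fields for the example analyte keyed `thca`.
thca = analytes.get('thca', {})
print(thca.get('name'), thca.get('type'), thca.get('cas_number'))
```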
## Data Splits
The data is split into subsets by analysis. You can retrieve all analytes by requesting the `all` subset.
```py
from datasets import load_dataset
# Get all of the analytes.
dataset = load_dataset('cannlytics/cannabis_analytes', 'all')
data = dataset['data']
```
## Curation Rationale
This dataset provides a standard set of analyte data for [cannabis tests](https://huggingface.co/datasets/cannlytics/cannabis_tests).
## Data Collection and Normalization
The `get_cannabis_analytes.py` routine is used to normalize values collected from Wikipedia.
## Known Limitations
The datasets are not complete and may include inaccurate information.
## Dataset Curators
Curated by [🔥Cannlytics](https://cannlytics.com)<br>
<contact@cannlytics.com>
## License
```
Copyright (c) 2023 Cannlytics
The files associated with this dataset are licensed under a
Creative Commons Attribution 4.0 International license.
You can share, copy and modify this dataset so long as you give
appropriate credit, provide a link to the CC BY license, and
indicate if changes were made, but you may not do so in a way
that suggests the rights holder has endorsed you or your use of
the dataset. Note that further permission may be required for
any content within the dataset that is identified as belonging
to a third party.
```
## Contributions
Thanks to [🔥Cannlytics](https://cannlytics.com), [@candy-o](https://github.com/candy-o), [@keeganskeate](https://github.com/keeganskeate), and the entire [Cannabis Data Science Team](https://meetup.com/cannabis-data-science/members) for their contributions.
|
Lollitor/SMILES1M | 2023-10-10T21:43:15.000Z | [
"region:us"
] | Lollitor | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 109901380
num_examples: 1000000
download_size: 43437052
dataset_size: 109901380
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "SMILES1M"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
osbm/unet-explainer-data | 2023-10-10T21:49:31.000Z | [
"region:us"
] | osbm | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-1.3B | 2023-10-10T21:38:50.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of princeton-nlp/Sheared-LLaMA-1.3B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [princeton-nlp/Sheared-LLaMA-1.3B](https://huggingface.co/princeton-nlp/Sheared-LLaMA-1.3B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-1.3B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T21:37:25.489785](https://huggingface.co/datasets/open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-1.3B/blob/main/results_2023-10-10T21-37-25.489785.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26098523540636526,\n\
\ \"acc_stderr\": 0.03182266156338253,\n \"acc_norm\": 0.2642292434758114,\n\
\ \"acc_norm_stderr\": 0.03182862029801721,\n \"mc1\": 0.21542227662178703,\n\
\ \"mc1_stderr\": 0.014391902652427683,\n \"mc2\": 0.3714304497051817,\n\
\ \"mc2_stderr\": 0.013675407405437916\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2909556313993174,\n \"acc_stderr\": 0.013273077865907588,\n\
\ \"acc_norm\": 0.32849829351535836,\n \"acc_norm_stderr\": 0.013724978465537368\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4552877912766381,\n\
\ \"acc_stderr\": 0.004969790407117543,\n \"acc_norm\": 0.6091416052579167,\n\
\ \"acc_norm_stderr\": 0.00486945515093381\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.28888888888888886,\n\
\ \"acc_stderr\": 0.03915450630414251,\n \"acc_norm\": 0.28888888888888886,\n\
\ \"acc_norm_stderr\": 0.03915450630414251\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.31,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2792452830188679,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.2792452830188679,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749912,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749912\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237656,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237656\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.33191489361702126,\n \"acc_stderr\": 0.03078373675774565,\n\
\ \"acc_norm\": 0.33191489361702126,\n \"acc_norm_stderr\": 0.03078373675774565\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n\
\ \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24338624338624337,\n \"acc_stderr\": 0.02210112878741542,\n \"\
acc_norm\": 0.24338624338624337,\n \"acc_norm_stderr\": 0.02210112878741542\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.16666666666666666,\n\
\ \"acc_stderr\": 0.033333333333333375,\n \"acc_norm\": 0.16666666666666666,\n\
\ \"acc_norm_stderr\": 0.033333333333333375\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.24838709677419354,\n \"acc_stderr\": 0.024580028921481,\n \"acc_norm\"\
: 0.24838709677419354,\n \"acc_norm_stderr\": 0.024580028921481\n },\n\
\ \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.30049261083743845,\n\
\ \"acc_stderr\": 0.03225799476233485,\n \"acc_norm\": 0.30049261083743845,\n\
\ \"acc_norm_stderr\": 0.03225799476233485\n },\n \"harness|hendrycksTest-high_school_computer_science|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"\
acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.21212121212121213,\n \"acc_stderr\": 0.02912652283458682,\n \"\
acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.02912652283458682\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.20725388601036268,\n \"acc_stderr\": 0.02925282329180362,\n\
\ \"acc_norm\": 0.20725388601036268,\n \"acc_norm_stderr\": 0.02925282329180362\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.26153846153846155,\n \"acc_stderr\": 0.022282141204204416,\n\
\ \"acc_norm\": 0.26153846153846155,\n \"acc_norm_stderr\": 0.022282141204204416\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.02755361446786381,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.02755361446786381\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22385321100917432,\n \"acc_stderr\": 0.017871217767790215,\n \"\
acc_norm\": 0.22385321100917432,\n \"acc_norm_stderr\": 0.017871217767790215\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.21296296296296297,\n \"acc_stderr\": 0.02792096314799366,\n \"\
acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.02792096314799366\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.27941176470588236,\n \"acc_stderr\": 0.031493281045079556,\n \"\
acc_norm\": 0.27941176470588236,\n \"acc_norm_stderr\": 0.031493281045079556\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.27848101265822783,\n \"acc_stderr\": 0.02917868230484256,\n \
\ \"acc_norm\": 0.27848101265822783,\n \"acc_norm_stderr\": 0.02917868230484256\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34977578475336324,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.34977578475336324,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.037683359597287434,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.037683359597287434\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.33884297520661155,\n \"acc_stderr\": 0.0432076780753667,\n \"\
acc_norm\": 0.33884297520661155,\n \"acc_norm_stderr\": 0.0432076780753667\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.28703703703703703,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.28703703703703703,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.26785714285714285,\n\
\ \"acc_stderr\": 0.04203277291467764,\n \"acc_norm\": 0.26785714285714285,\n\
\ \"acc_norm_stderr\": 0.04203277291467764\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.23300970873786409,\n \"acc_stderr\": 0.041858325989283164,\n\
\ \"acc_norm\": 0.23300970873786409,\n \"acc_norm_stderr\": 0.041858325989283164\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.31196581196581197,\n\
\ \"acc_stderr\": 0.03035152732334496,\n \"acc_norm\": 0.31196581196581197,\n\
\ \"acc_norm_stderr\": 0.03035152732334496\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2720306513409962,\n\
\ \"acc_stderr\": 0.015913367447500517,\n \"acc_norm\": 0.2720306513409962,\n\
\ \"acc_norm_stderr\": 0.015913367447500517\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2398843930635838,\n \"acc_stderr\": 0.02298959254312357,\n\
\ \"acc_norm\": 0.2398843930635838,\n \"acc_norm_stderr\": 0.02298959254312357\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.26143790849673204,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.27009646302250806,\n\
\ \"acc_stderr\": 0.025218040373410626,\n \"acc_norm\": 0.27009646302250806,\n\
\ \"acc_norm_stderr\": 0.025218040373410626\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25308641975308643,\n \"acc_stderr\": 0.024191808600713,\n\
\ \"acc_norm\": 0.25308641975308643,\n \"acc_norm_stderr\": 0.024191808600713\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2801418439716312,\n \"acc_stderr\": 0.026789172351140245,\n \
\ \"acc_norm\": 0.2801418439716312,\n \"acc_norm_stderr\": 0.026789172351140245\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26140808344198174,\n\
\ \"acc_stderr\": 0.011222528169771314,\n \"acc_norm\": 0.26140808344198174,\n\
\ \"acc_norm_stderr\": 0.011222528169771314\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.1801470588235294,\n \"acc_stderr\": 0.02334516361654485,\n\
\ \"acc_norm\": 0.1801470588235294,\n \"acc_norm_stderr\": 0.02334516361654485\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.27941176470588236,\n \"acc_stderr\": 0.018152871051538812,\n \
\ \"acc_norm\": 0.27941176470588236,\n \"acc_norm_stderr\": 0.018152871051538812\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3090909090909091,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.3090909090909091,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.23673469387755103,\n \"acc_stderr\": 0.027212835884073142,\n\
\ \"acc_norm\": 0.23673469387755103,\n \"acc_norm_stderr\": 0.027212835884073142\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.21890547263681592,\n\
\ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.21890547263681592,\n\
\ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25301204819277107,\n\
\ \"acc_stderr\": 0.03384429155233135,\n \"acc_norm\": 0.25301204819277107,\n\
\ \"acc_norm_stderr\": 0.03384429155233135\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.26900584795321636,\n \"acc_stderr\": 0.0340105262010409,\n\
\ \"acc_norm\": 0.26900584795321636,\n \"acc_norm_stderr\": 0.0340105262010409\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.21542227662178703,\n\
\ \"mc1_stderr\": 0.014391902652427683,\n \"mc2\": 0.3714304497051817,\n\
\ \"mc2_stderr\": 0.013675407405437916\n }\n}\n```"
repo_url: https://huggingface.co/princeton-nlp/Sheared-LLaMA-1.3B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|arc:challenge|25_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hellaswag|10_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T21-37-25.489785.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T21-37-25.489785.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T21-37-25.489785.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T21-37-25.489785.parquet'
- config_name: results
data_files:
- split: 2023_10_10T21_37_25.489785
path:
- results_2023-10-10T21-37-25.489785.parquet
- split: latest
path:
- results_2023-10-10T21-37-25.489785.parquet
---
# Dataset Card for Evaluation run of princeton-nlp/Sheared-LLaMA-1.3B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/princeton-nlp/Sheared-LLaMA-1.3B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [princeton-nlp/Sheared-LLaMA-1.3B](https://huggingface.co/princeton-nlp/Sheared-LLaMA-1.3B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-1.3B",
"harness_truthfulqa_mc_0",
split="train")
```
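Each per-task configuration also exposes a timestamped split and a `latest` alias (see the `configs` list in the YAML header above), so you can pin a specific run explicitly. A minimal sketch, assuming the config names defined in this card:
```python
from datasets import load_dataset

# Load the ARC-Challenge details from the most recent run via the "latest"
# split alias; the timestamped split name "2023_10_10T21_37_25.489785"
# would pin this exact run instead.
details = load_dataset(
    "open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-1.3B",
    "harness_arc_challenge_25",
    split="latest",
)
print(details)
```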
## Latest results
These are the [latest results from run 2023-10-10T21:37:25.489785](https://huggingface.co/datasets/open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-1.3B/blob/main/results_2023-10-10T21-37-25.489785.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26098523540636526,
"acc_stderr": 0.03182266156338253,
"acc_norm": 0.2642292434758114,
"acc_norm_stderr": 0.03182862029801721,
"mc1": 0.21542227662178703,
"mc1_stderr": 0.014391902652427683,
"mc2": 0.3714304497051817,
"mc2_stderr": 0.013675407405437916
},
"harness|arc:challenge|25": {
"acc": 0.2909556313993174,
"acc_stderr": 0.013273077865907588,
"acc_norm": 0.32849829351535836,
"acc_norm_stderr": 0.013724978465537368
},
"harness|hellaswag|10": {
"acc": 0.4552877912766381,
"acc_stderr": 0.004969790407117543,
"acc_norm": 0.6091416052579167,
"acc_norm_stderr": 0.00486945515093381
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.03915450630414251,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.03915450630414251
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2792452830188679,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.2792452830188679,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749912,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749912
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237656,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237656
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.33191489361702126,
"acc_stderr": 0.03078373675774565,
"acc_norm": 0.33191489361702126,
"acc_norm_stderr": 0.03078373675774565
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24338624338624337,
"acc_stderr": 0.02210112878741542,
"acc_norm": 0.24338624338624337,
"acc_norm_stderr": 0.02210112878741542
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.033333333333333375,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.033333333333333375
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24838709677419354,
"acc_stderr": 0.024580028921481,
"acc_norm": 0.24838709677419354,
"acc_norm_stderr": 0.024580028921481
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.30049261083743845,
"acc_stderr": 0.03225799476233485,
"acc_norm": 0.30049261083743845,
"acc_norm_stderr": 0.03225799476233485
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.02912652283458682,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.02912652283458682
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.20725388601036268,
"acc_stderr": 0.02925282329180362,
"acc_norm": 0.20725388601036268,
"acc_norm_stderr": 0.02925282329180362
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.26153846153846155,
"acc_stderr": 0.022282141204204416,
"acc_norm": 0.26153846153846155,
"acc_norm_stderr": 0.022282141204204416
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.02755361446786381,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.02755361446786381
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22385321100917432,
"acc_stderr": 0.017871217767790215,
"acc_norm": 0.22385321100917432,
"acc_norm_stderr": 0.017871217767790215
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.02792096314799366,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.02792096314799366
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.27941176470588236,
"acc_stderr": 0.031493281045079556,
"acc_norm": 0.27941176470588236,
"acc_norm_stderr": 0.031493281045079556
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.27848101265822783,
"acc_stderr": 0.02917868230484256,
"acc_norm": 0.27848101265822783,
"acc_norm_stderr": 0.02917868230484256
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.34977578475336324,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.34977578475336324,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.33884297520661155,
"acc_stderr": 0.0432076780753667,
"acc_norm": 0.33884297520661155,
"acc_norm_stderr": 0.0432076780753667
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.26785714285714285,
"acc_stderr": 0.04203277291467764,
"acc_norm": 0.26785714285714285,
"acc_norm_stderr": 0.04203277291467764
},
"harness|hendrycksTest-management|5": {
"acc": 0.23300970873786409,
"acc_stderr": 0.041858325989283164,
"acc_norm": 0.23300970873786409,
"acc_norm_stderr": 0.041858325989283164
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.31196581196581197,
"acc_stderr": 0.03035152732334496,
"acc_norm": 0.31196581196581197,
"acc_norm_stderr": 0.03035152732334496
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2720306513409962,
"acc_stderr": 0.015913367447500517,
"acc_norm": 0.2720306513409962,
"acc_norm_stderr": 0.015913367447500517
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2398843930635838,
"acc_stderr": 0.02298959254312357,
"acc_norm": 0.2398843930635838,
"acc_norm_stderr": 0.02298959254312357
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.27009646302250806,
"acc_stderr": 0.025218040373410626,
"acc_norm": 0.27009646302250806,
"acc_norm_stderr": 0.025218040373410626
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25308641975308643,
"acc_stderr": 0.024191808600713,
"acc_norm": 0.25308641975308643,
"acc_norm_stderr": 0.024191808600713
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2801418439716312,
"acc_stderr": 0.026789172351140245,
"acc_norm": 0.2801418439716312,
"acc_norm_stderr": 0.026789172351140245
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26140808344198174,
"acc_stderr": 0.011222528169771314,
"acc_norm": 0.26140808344198174,
"acc_norm_stderr": 0.011222528169771314
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.1801470588235294,
"acc_stderr": 0.02334516361654485,
"acc_norm": 0.1801470588235294,
"acc_norm_stderr": 0.02334516361654485
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.27941176470588236,
"acc_stderr": 0.018152871051538812,
"acc_norm": 0.27941176470588236,
"acc_norm_stderr": 0.018152871051538812
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3090909090909091,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.3090909090909091,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.23673469387755103,
"acc_stderr": 0.027212835884073142,
"acc_norm": 0.23673469387755103,
"acc_norm_stderr": 0.027212835884073142
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.21890547263681592,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.21890547263681592,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25301204819277107,
"acc_stderr": 0.03384429155233135,
"acc_norm": 0.25301204819277107,
"acc_norm_stderr": 0.03384429155233135
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.26900584795321636,
"acc_stderr": 0.0340105262010409,
"acc_norm": 0.26900584795321636,
"acc_norm_stderr": 0.0340105262010409
},
"harness|truthfulqa:mc|0": {
"mc1": 0.21542227662178703,
"mc1_stderr": 0.014391902652427683,
"mc2": 0.3714304497051817,
"mc2_stderr": 0.013675407405437916
}
}
```
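The aggregated numbers above are also stored as a parquet file under the `results` configuration; a minimal sketch for reading them back, assuming the `latest` split defined in the configs above:
```python
from datasets import load_dataset

# The "results" config holds the aggregated metrics shown in the JSON above,
# one file per run; "latest" points at the most recent one.
results = load_dataset(
    "open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-1.3B",
    "results",
    split="latest",
)
print(results)  # inspect the schema before relying on specific column names
```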
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-2.7B | 2023-10-10T21:44:07.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of princeton-nlp/Sheared-LLaMA-2.7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [princeton-nlp/Sheared-LLaMA-2.7B](https://huggingface.co/princeton-nlp/Sheared-LLaMA-2.7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-2.7B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T21:42:42.589642](https://huggingface.co/datasets/open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-2.7B/blob/main/results_2023-10-10T21-42-42.589642.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.27538221418700665,\n\
\ \"acc_stderr\": 0.03229722001912762,\n \"acc_norm\": 0.2792022693364169,\n\
\ \"acc_norm_stderr\": 0.03229300434645575,\n \"mc1\": 0.2460220318237454,\n\
\ \"mc1_stderr\": 0.015077219200662594,\n \"mc2\": 0.37319738604899816,\n\
\ \"mc2_stderr\": 0.013647292180478764\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3822525597269625,\n \"acc_stderr\": 0.014200454049979282,\n\
\ \"acc_norm\": 0.41723549488054607,\n \"acc_norm_stderr\": 0.014409825518403084\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.519717187811193,\n\
\ \"acc_stderr\": 0.004985900172317697,\n \"acc_norm\": 0.7101175064728141,\n\
\ \"acc_norm_stderr\": 0.004527804016253782\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.31851851851851853,\n\
\ \"acc_stderr\": 0.04024778401977109,\n \"acc_norm\": 0.31851851851851853,\n\
\ \"acc_norm_stderr\": 0.04024778401977109\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.26037735849056604,\n \"acc_stderr\": 0.027008766090708094,\n\
\ \"acc_norm\": 0.26037735849056604,\n \"acc_norm_stderr\": 0.027008766090708094\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n\
\ \"acc_stderr\": 0.037161774375660164,\n \"acc_norm\": 0.2708333333333333,\n\
\ \"acc_norm_stderr\": 0.037161774375660164\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n\
\ \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n\
\ \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2851063829787234,\n \"acc_stderr\": 0.029513196625539355,\n\
\ \"acc_norm\": 0.2851063829787234,\n \"acc_norm_stderr\": 0.029513196625539355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322004,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322004\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131184,\n\
\ \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131184\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1746031746031746,\n\
\ \"acc_stderr\": 0.033954900208561095,\n \"acc_norm\": 0.1746031746031746,\n\
\ \"acc_norm_stderr\": 0.033954900208561095\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.267741935483871,\n \"acc_stderr\": 0.025189006660212378,\n \"\
acc_norm\": 0.267741935483871,\n \"acc_norm_stderr\": 0.025189006660212378\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694433,\n \"\
acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694433\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2787878787878788,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.2787878787878788,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2777777777777778,\n \"acc_stderr\": 0.031911782267135466,\n \"\
acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.031911782267135466\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.3160621761658031,\n \"acc_stderr\": 0.033553973696861736,\n\
\ \"acc_norm\": 0.3160621761658031,\n \"acc_norm_stderr\": 0.033553973696861736\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2564102564102564,\n \"acc_stderr\": 0.02213908110397154,\n \
\ \"acc_norm\": 0.2564102564102564,\n \"acc_norm_stderr\": 0.02213908110397154\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145668,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145668\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.2605042016806723,\n \"acc_stderr\": 0.028510251512341933,\n\
\ \"acc_norm\": 0.2605042016806723,\n \"acc_norm_stderr\": 0.028510251512341933\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360384,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360384\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24954128440366974,\n \"acc_stderr\": 0.018553897629501624,\n \"\
acc_norm\": 0.24954128440366974,\n \"acc_norm_stderr\": 0.018553897629501624\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.26851851851851855,\n \"acc_stderr\": 0.030225226160012404,\n \"\
acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.030225226160012404\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24509803921568626,\n \"acc_stderr\": 0.030190282453501936,\n \"\
acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.030190282453501936\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3206751054852321,\n \"acc_stderr\": 0.030381931949990407,\n \
\ \"acc_norm\": 0.3206751054852321,\n \"acc_norm_stderr\": 0.030381931949990407\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34080717488789236,\n\
\ \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.34080717488789236,\n\
\ \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.33884297520661155,\n \"acc_stderr\": 0.04320767807536671,\n \"\
acc_norm\": 0.33884297520661155,\n \"acc_norm_stderr\": 0.04320767807536671\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25766871165644173,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.25766871165644173,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.041577515398656284,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.041577515398656284\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.23300970873786409,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.23300970873786409,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24358974358974358,\n\
\ \"acc_stderr\": 0.028120966503914404,\n \"acc_norm\": 0.24358974358974358,\n\
\ \"acc_norm_stderr\": 0.028120966503914404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.28991060025542786,\n\
\ \"acc_stderr\": 0.016225017944770954,\n \"acc_norm\": 0.28991060025542786,\n\
\ \"acc_norm_stderr\": 0.016225017944770954\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27039106145251396,\n\
\ \"acc_stderr\": 0.014854993938010073,\n \"acc_norm\": 0.27039106145251396,\n\
\ \"acc_norm_stderr\": 0.014854993938010073\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.026090162504279053,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.026090162504279053\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3215434083601286,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.3215434083601286,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.024659685185967277,\n\
\ \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.024659685185967277\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2695035460992908,\n \"acc_stderr\": 0.026469036818590617,\n \
\ \"acc_norm\": 0.2695035460992908,\n \"acc_norm_stderr\": 0.026469036818590617\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26792698826597133,\n\
\ \"acc_stderr\": 0.011311347690633881,\n \"acc_norm\": 0.26792698826597133,\n\
\ \"acc_norm_stderr\": 0.011311347690633881\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.27205882352941174,\n \"acc_stderr\": 0.02703304115168146,\n\
\ \"acc_norm\": 0.27205882352941174,\n \"acc_norm_stderr\": 0.02703304115168146\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24019607843137256,\n \"acc_stderr\": 0.01728276069516742,\n \
\ \"acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.01728276069516742\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.39090909090909093,\n\
\ \"acc_stderr\": 0.04673752333670238,\n \"acc_norm\": 0.39090909090909093,\n\
\ \"acc_norm_stderr\": 0.04673752333670238\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.1836734693877551,\n \"acc_stderr\": 0.024789071332007643,\n\
\ \"acc_norm\": 0.1836734693877551,\n \"acc_norm_stderr\": 0.024789071332007643\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.26865671641791045,\n\
\ \"acc_stderr\": 0.03134328358208955,\n \"acc_norm\": 0.26865671641791045,\n\
\ \"acc_norm_stderr\": 0.03134328358208955\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3072289156626506,\n\
\ \"acc_stderr\": 0.03591566797824662,\n \"acc_norm\": 0.3072289156626506,\n\
\ \"acc_norm_stderr\": 0.03591566797824662\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.033773102522091945,\n\
\ \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.033773102522091945\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2460220318237454,\n\
\ \"mc1_stderr\": 0.015077219200662594,\n \"mc2\": 0.37319738604899816,\n\
\ \"mc2_stderr\": 0.013647292180478764\n }\n}\n```"
repo_url: https://huggingface.co/princeton-nlp/Sheared-LLaMA-2.7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|arc:challenge|25_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hellaswag|10_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T21-42-42.589642.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T21-42-42.589642.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T21-42-42.589642.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T21-42-42.589642.parquet'
- config_name: results
data_files:
- split: 2023_10_10T21_42_42.589642
path:
- results_2023-10-10T21-42-42.589642.parquet
- split: latest
path:
- results_2023-10-10T21-42-42.589642.parquet
---
# Dataset Card for Evaluation run of princeton-nlp/Sheared-LLaMA-2.7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/princeton-nlp/Sheared-LLaMA-2.7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [princeton-nlp/Sheared-LLaMA-2.7B](https://huggingface.co/princeton-nlp/Sheared-LLaMA-2.7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-2.7B",
"harness_truthfulqa_mc_0",
split="train")
```
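For instance, here is a sketch of pulling the most recent per-example details for a single task via its `latest` split (the config name below follows the `configs` list in this card's metadata; `harness_arc_challenge_25` is used purely as an example):
```python
from datasets import load_dataset

# "latest" resolves to the newest timestamped run listed under this config,
# so this always returns the per-example details of the most recent eval.
arc_details = load_dataset(
    "open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-2.7B",
    "harness_arc_challenge_25",
    split="latest",
)
print(arc_details)
```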
## Latest results
These are the [latest results from run 2023-10-10T21:42:42.589642](https://huggingface.co/datasets/open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-2.7B/blob/main/results_2023-10-10T21-42-42.589642.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.27538221418700665,
"acc_stderr": 0.03229722001912762,
"acc_norm": 0.2792022693364169,
"acc_norm_stderr": 0.03229300434645575,
"mc1": 0.2460220318237454,
"mc1_stderr": 0.015077219200662594,
"mc2": 0.37319738604899816,
"mc2_stderr": 0.013647292180478764
},
"harness|arc:challenge|25": {
"acc": 0.3822525597269625,
"acc_stderr": 0.014200454049979282,
"acc_norm": 0.41723549488054607,
"acc_norm_stderr": 0.014409825518403084
},
"harness|hellaswag|10": {
"acc": 0.519717187811193,
"acc_stderr": 0.004985900172317697,
"acc_norm": 0.7101175064728141,
"acc_norm_stderr": 0.004527804016253782
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.04024778401977109,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.04024778401977109
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19736842105263158,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.19736842105263158,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.26037735849056604,
"acc_stderr": 0.027008766090708094,
"acc_norm": 0.26037735849056604,
"acc_norm_stderr": 0.027008766090708094
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.037161774375660164,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.037161774375660164
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2851063829787234,
"acc_stderr": 0.029513196625539355,
"acc_norm": 0.2851063829787234,
"acc_norm_stderr": 0.029513196625539355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322004,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322004
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.22758620689655173,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.22758620689655173,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1746031746031746,
"acc_stderr": 0.033954900208561095,
"acc_norm": 0.1746031746031746,
"acc_norm_stderr": 0.033954900208561095
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.267741935483871,
"acc_stderr": 0.025189006660212378,
"acc_norm": 0.267741935483871,
"acc_norm_stderr": 0.025189006660212378
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2512315270935961,
"acc_stderr": 0.030516530732694433,
"acc_norm": 0.2512315270935961,
"acc_norm_stderr": 0.030516530732694433
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2787878787878788,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.2787878787878788,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.031911782267135466,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.031911782267135466
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3160621761658031,
"acc_stderr": 0.033553973696861736,
"acc_norm": 0.3160621761658031,
"acc_norm_stderr": 0.033553973696861736
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.02213908110397154,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.02213908110397154
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145668,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145668
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2605042016806723,
"acc_stderr": 0.028510251512341933,
"acc_norm": 0.2605042016806723,
"acc_norm_stderr": 0.028510251512341933
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360384,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360384
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24954128440366974,
"acc_stderr": 0.018553897629501624,
"acc_norm": 0.24954128440366974,
"acc_norm_stderr": 0.018553897629501624
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.030225226160012404,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.030225226160012404
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.030190282453501936,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.030190282453501936
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3206751054852321,
"acc_stderr": 0.030381931949990407,
"acc_norm": 0.3206751054852321,
"acc_norm_stderr": 0.030381931949990407
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.34080717488789236,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.34080717488789236,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.33884297520661155,
"acc_stderr": 0.04320767807536671,
"acc_norm": 0.33884297520661155,
"acc_norm_stderr": 0.04320767807536671
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25766871165644173,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.25766871165644173,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.041577515398656284,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.041577515398656284
},
"harness|hendrycksTest-management|5": {
"acc": 0.23300970873786409,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.23300970873786409,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.24358974358974358,
"acc_stderr": 0.028120966503914404,
"acc_norm": 0.24358974358974358,
"acc_norm_stderr": 0.028120966503914404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.28991060025542786,
"acc_stderr": 0.016225017944770954,
"acc_norm": 0.28991060025542786,
"acc_norm_stderr": 0.016225017944770954
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27039106145251396,
"acc_stderr": 0.014854993938010073,
"acc_norm": 0.27039106145251396,
"acc_norm_stderr": 0.014854993938010073
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.026090162504279053,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.026090162504279053
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3215434083601286,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.3215434083601286,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.024659685185967277,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.024659685185967277
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2695035460992908,
"acc_stderr": 0.026469036818590617,
"acc_norm": 0.2695035460992908,
"acc_norm_stderr": 0.026469036818590617
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26792698826597133,
"acc_stderr": 0.011311347690633881,
"acc_norm": 0.26792698826597133,
"acc_norm_stderr": 0.011311347690633881
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.27205882352941174,
"acc_stderr": 0.02703304115168146,
"acc_norm": 0.27205882352941174,
"acc_norm_stderr": 0.02703304115168146
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.01728276069516742,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.01728276069516742
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.39090909090909093,
"acc_stderr": 0.04673752333670238,
"acc_norm": 0.39090909090909093,
"acc_norm_stderr": 0.04673752333670238
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.1836734693877551,
"acc_stderr": 0.024789071332007643,
"acc_norm": 0.1836734693877551,
"acc_norm_stderr": 0.024789071332007643
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.26865671641791045,
"acc_stderr": 0.03134328358208955,
"acc_norm": 0.26865671641791045,
"acc_norm_stderr": 0.03134328358208955
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3072289156626506,
"acc_stderr": 0.03591566797824662,
"acc_norm": 0.3072289156626506,
"acc_norm_stderr": 0.03591566797824662
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.033773102522091945,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.033773102522091945
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2460220318237454,
"mc1_stderr": 0.015077219200662594,
"mc2": 0.37319738604899816,
"mc2_stderr": 0.013647292180478764
}
}
```
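The aggregated numbers above are also stored in the `results` config of this dataset. A minimal loading sketch follows (the exact column layout of the results parquet is not documented in this card, so inspect the schema before relying on specific fields):
```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics; its "latest" split
# points to the most recent results parquet for this model.
results = load_dataset(
    "open-llm-leaderboard/details_princeton-nlp__Sheared-LLaMA-2.7B",
    "results",
    split="latest",
)
# The parquet schema is undocumented here, so list the columns first.
print(results.column_names)
```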
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
yuchenlin/i-Mind2Web | 2023-10-10T23:54:54.000Z | [
"language:en",
"license:mit",
"region:us"
] | yuchenlin | null | null | null | 0 | 0 | ---
license: mit
language:
- en
configs:
- config_name: default
data_files:
- split: test_mini
path: "K=10/test_mini.json"
- split: test_all
path: "K=10/test_all.json"
- split: dev
path: "K=10/dev.json"
- split: dev_5
path: "K=10/K=5_dev.json"
- split: train
path: "K=10/train.json"
--- |
Soheil-FM/faq | 2023-10-10T22:04:14.000Z | [
"region:us"
] | Soheil-FM | null | null | null | 0 | 0 | Entry not found |
ostapeno/old_wiki_SUB_10_sps01_generated_platypus_icl5_answers_iter1_answers | 2023-10-10T22:08:47.000Z | [
"region:us"
] | ostapeno | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: subject
dtype: string
- name: response
dtype: string
- name: author_instr
dtype: string
- name: inst_index_for_context
dtype: 'null'
- name: author_response
dtype: string
- name: normalized_cumul_logprob_response
dtype: float64
splits:
- name: formal_logic
num_bytes: 844076.4602169982
num_examples: 228
- name: machine_learning
num_bytes: 1280923.0492766728
num_examples: 346
- name: global_facts
num_bytes: 1469729.6259041592
num_examples: 397
- name: abstract_algebra
num_bytes: 795949.2936256782
num_examples: 215
- name: high_school_physics
num_bytes: 2184232.9452983723
num_examples: 590
- name: college_biology
num_bytes: 1565983.9590867993
num_examples: 423
- name: high_school_government_and_politics
num_bytes: 1680748.7409584087
num_examples: 454
- name: prehistory
num_bytes: 2435975.0474683545
num_examples: 658
- name: security_studies
num_bytes: 2224955.9324141047
num_examples: 601
- name: sociology
num_bytes: 1895469.945750452
num_examples: 512
download_size: 8592037
dataset_size: 16378045.0
---
# Dataset Card for "old_wiki_SUB_10_sps01_generated_platypus_icl5_answers_iter1_answers"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-acronym_identification-default-d4ab15-94341146060 | 2023-10-10T22:16:34.000Z | [
"region:us"
] | autoevaluate | null | null | null | 0 | 0 | Entry not found |
zeio/baneks-speech | 2023-10-11T00:04:19.000Z | [
"license:apache-2.0",
"region:us"
] | zeio | This dataset contains speech generated for anecdotes from the [baneks dataset](https://huggingface.co/datasets/zeio/baneks) | null | null | 0 | 0 | ---
license: apache-2.0
---
|
totallyrealaccount/Rvrgt | 2023-10-10T22:31:51.000Z | [
"region:us"
] | totallyrealaccount | null | null | null | 0 | 0 | Entry not found |
Fraol/TrainDedupedRefDatasetWMetricFinal3 | 2023-10-10T22:36:49.000Z | [
"region:us"
] | Fraol | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: source
dtype: string
- name: path_name
dtype: string
- name: file_name
dtype: string
- name: ref_type
dtype: string
- name: hash
dtype: string
- name: class_name
dtype: string
- name: method_name
dtype: string
- name: row_number
dtype: int64
- name: cbo
dtype: float64
- name: wmc
dtype: float64
- name: lcom*
dtype: float64
- name: loc
dtype: float64
- name: astc2
dtype: string
- name: source_after
dtype: string
- name: cbo_after
dtype: float64
- name: wmc_after
dtype: float64
- name: lcom*_after
dtype: float64
- name: loc_after
dtype: float64
- name: astc1
dtype: string
- name: issue_name
dtype: string
splits:
- name: train
num_bytes: 168485103
num_examples: 6000
- name: test
num_bytes: 41591279
num_examples: 1500
download_size: 47665229
dataset_size: 210076382
---
# Dataset Card for "TrainDedupedRefDatasetWMetricFinal3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rchan26/english_char_split | 2023-10-10T22:40:41.000Z | [
"region:us"
] | rchan26 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: word
dtype: string
- name: language
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: special_tokens_mask
sequence: int8
- name: tokens
sequence: string
splits:
- name: train
num_bytes: 5314782
num_examples: 37863
- name: test
num_bytes: 1979650
num_examples: 14129
- name: validation
num_bytes: 2613902
num_examples: 18649
download_size: 2205306
dataset_size: 9908334
---
# Dataset Card for "english_char_split"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
emrgnt-cmplxty/textbook-instruct-full | 2023-10-10T23:27:47.000Z | [
"region:us"
] | emrgnt-cmplxty | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: formatted_prompt
dtype: string
- name: completion
dtype: string
splits:
- name: train
num_bytes: 1973942
num_examples: 71
download_size: 0
dataset_size: 1973942
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "textbook-instruct-full"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
growth-cadet/news_to_json02 | 2023-10-10T23:01:34.000Z | [
"region:us"
] | growth-cadet | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: context
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 6618664
num_examples: 1338
download_size: 3194272
dataset_size: 6618664
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "news_to_json02"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kejian/odmeeting_oracle_govrep_format | 2023-10-10T23:07:56.000Z | [
"region:us"
] | kejian | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: int64
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 25751717
num_examples: 261
- name: test
num_bytes: 13445910
num_examples: 131
- name: validation
num_bytes: 4116222
num_examples: 44
download_size: 21987705
dataset_size: 43313849
---
# Dataset Card for "odmeeting_oracle_govrep_format"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B | 2023-10-10T23:10:36.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Severian/ANIMA-Phi-Neptune-Mistral-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Severian/ANIMA-Phi-Neptune-Mistral-7B](https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-10-10T23:09:12.843992](https://huggingface.co/datasets/open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B/blob/main/results_2023-10-10T23-09-12.843992.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks; you can find each of them in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5294074045529835,\n\
\ \"acc_stderr\": 0.03498173594538078,\n \"acc_norm\": 0.5333993126870243,\n\
\ \"acc_norm_stderr\": 0.03496838460772866,\n \"mc1\": 0.41615667074663404,\n\
\ \"mc1_stderr\": 0.01725565750290304,\n \"mc2\": 0.5976206456012524,\n\
\ \"mc2_stderr\": 0.015001808132807819\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.514505119453925,\n \"acc_stderr\": 0.014605241081370056,\n\
\ \"acc_norm\": 0.5597269624573379,\n \"acc_norm_stderr\": 0.014506769524804236\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5718980282812188,\n\
\ \"acc_stderr\": 0.004937924326742568,\n \"acc_norm\": 0.7621987651862179,\n\
\ \"acc_norm_stderr\": 0.004248666961833351\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5394736842105263,\n \"acc_stderr\": 0.04056242252249033,\n\
\ \"acc_norm\": 0.5394736842105263,\n \"acc_norm_stderr\": 0.04056242252249033\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6037735849056604,\n \"acc_stderr\": 0.030102793781791197,\n\
\ \"acc_norm\": 0.6037735849056604,\n \"acc_norm_stderr\": 0.030102793781791197\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110175,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110175\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n\
\ \"acc_stderr\": 0.03794012674697031,\n \"acc_norm\": 0.5491329479768786,\n\
\ \"acc_norm_stderr\": 0.03794012674697031\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006716,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006716\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35978835978835977,\n \"acc_stderr\": 0.024718075944129277,\n \"\
acc_norm\": 0.35978835978835977,\n \"acc_norm_stderr\": 0.024718075944129277\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.0436031486007746,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.0436031486007746\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5870967741935483,\n \"acc_stderr\": 0.028009138125400387,\n \"\
acc_norm\": 0.5870967741935483,\n \"acc_norm_stderr\": 0.028009138125400387\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3694581280788177,\n \"acc_stderr\": 0.03395970381998574,\n \"\
acc_norm\": 0.3694581280788177,\n \"acc_norm_stderr\": 0.03395970381998574\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.03713158067481913,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.03713158067481913\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6818181818181818,\n \"acc_stderr\": 0.0331847733384533,\n \"acc_norm\"\
: 0.6818181818181818,\n \"acc_norm_stderr\": 0.0331847733384533\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.7253886010362695,\n \"acc_stderr\": 0.03221024508041152,\n\
\ \"acc_norm\": 0.7253886010362695,\n \"acc_norm_stderr\": 0.03221024508041152\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5,\n \"acc_stderr\": 0.02535100632816969,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.02535100632816969\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\
: {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n\
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4831932773109244,\n \"acc_stderr\": 0.03246013680375308,\n \
\ \"acc_norm\": 0.4831932773109244,\n \"acc_norm_stderr\": 0.03246013680375308\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7119266055045872,\n\
\ \"acc_stderr\": 0.01941644589263603,\n \"acc_norm\": 0.7119266055045872,\n\
\ \"acc_norm_stderr\": 0.01941644589263603\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.03293377139415191,\n\
\ \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.03293377139415191\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7156862745098039,\n \"acc_stderr\": 0.03166009679399812,\n \"\
acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.03166009679399812\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7046413502109705,\n \"acc_stderr\": 0.029696338713422882,\n \
\ \"acc_norm\": 0.7046413502109705,\n \"acc_norm_stderr\": 0.029696338713422882\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5874439461883408,\n\
\ \"acc_stderr\": 0.03304062175449297,\n \"acc_norm\": 0.5874439461883408,\n\
\ \"acc_norm_stderr\": 0.03304062175449297\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6030534351145038,\n \"acc_stderr\": 0.04291135671009224,\n\
\ \"acc_norm\": 0.6030534351145038,\n \"acc_norm_stderr\": 0.04291135671009224\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6776859504132231,\n \"acc_stderr\": 0.04266416363352168,\n \"\
acc_norm\": 0.6776859504132231,\n \"acc_norm_stderr\": 0.04266416363352168\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6203703703703703,\n\
\ \"acc_stderr\": 0.04691521224077742,\n \"acc_norm\": 0.6203703703703703,\n\
\ \"acc_norm_stderr\": 0.04691521224077742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6196319018404908,\n \"acc_stderr\": 0.038142698932618374,\n\
\ \"acc_norm\": 0.6196319018404908,\n \"acc_norm_stderr\": 0.038142698932618374\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6504854368932039,\n \"acc_stderr\": 0.047211885060971716,\n\
\ \"acc_norm\": 0.6504854368932039,\n \"acc_norm_stderr\": 0.047211885060971716\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.02514093595033543,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.02514093595033543\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7062579821200511,\n\
\ \"acc_stderr\": 0.016287759388491665,\n \"acc_norm\": 0.7062579821200511,\n\
\ \"acc_norm_stderr\": 0.016287759388491665\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5606936416184971,\n \"acc_stderr\": 0.026720034380514995,\n\
\ \"acc_norm\": 0.5606936416184971,\n \"acc_norm_stderr\": 0.026720034380514995\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30614525139664805,\n\
\ \"acc_stderr\": 0.01541449448790321,\n \"acc_norm\": 0.30614525139664805,\n\
\ \"acc_norm_stderr\": 0.01541449448790321\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.545751633986928,\n \"acc_stderr\": 0.028509807802626585,\n\
\ \"acc_norm\": 0.545751633986928,\n \"acc_norm_stderr\": 0.028509807802626585\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6012861736334405,\n\
\ \"acc_stderr\": 0.0278093225857745,\n \"acc_norm\": 0.6012861736334405,\n\
\ \"acc_norm_stderr\": 0.0278093225857745\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6049382716049383,\n \"acc_stderr\": 0.027201117666925657,\n\
\ \"acc_norm\": 0.6049382716049383,\n \"acc_norm_stderr\": 0.027201117666925657\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.375886524822695,\n \"acc_stderr\": 0.02889395541211588,\n \
\ \"acc_norm\": 0.375886524822695,\n \"acc_norm_stderr\": 0.02889395541211588\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3878748370273794,\n\
\ \"acc_stderr\": 0.012444998309675617,\n \"acc_norm\": 0.3878748370273794,\n\
\ \"acc_norm_stderr\": 0.012444998309675617\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.46691176470588236,\n \"acc_stderr\": 0.030306257722468324,\n\
\ \"acc_norm\": 0.46691176470588236,\n \"acc_norm_stderr\": 0.030306257722468324\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5049019607843137,\n \"acc_stderr\": 0.02022686271003946,\n \
\ \"acc_norm\": 0.5049019607843137,\n \"acc_norm_stderr\": 0.02022686271003946\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6489795918367347,\n \"acc_stderr\": 0.030555316755573637,\n\
\ \"acc_norm\": 0.6489795918367347,\n \"acc_norm_stderr\": 0.030555316755573637\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n\
\ \"acc_stderr\": 0.031343283582089536,\n \"acc_norm\": 0.7313432835820896,\n\
\ \"acc_norm_stderr\": 0.031343283582089536\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.034462962170884265,\n\
\ \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.034462962170884265\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.41615667074663404,\n\
\ \"mc1_stderr\": 0.01725565750290304,\n \"mc2\": 0.5976206456012524,\n\
\ \"mc2_stderr\": 0.015001808132807819\n }\n}\n```"
repo_url: https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|arc:challenge|25_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hellaswag|10_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T23-09-12.843992.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T23-09-12.843992.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T23-09-12.843992.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T23-09-12.843992.parquet'
- config_name: results
data_files:
- split: 2023_10_10T23_09_12.843992
path:
- results_2023-10-10T23-09-12.843992.parquet
- split: latest
path:
- results_2023-10-10T23-09-12.843992.parquet
---
# Dataset Card for Evaluation run of Severian/ANIMA-Phi-Neptune-Mistral-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Severian/ANIMA-Phi-Neptune-Mistral-7B](https://huggingface.co/Severian/ANIMA-Phi-Neptune-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B",
"harness_truthfulqa_mc_0",
split="train")
```
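If you want to pin a specific run rather than rely on the "train" alias, the YAML configs above also define a timestamped split and a "latest" split for every configuration, plus an aggregated "results" configuration. Below is a minimal sketch (not part of the auto-generated card), assuming only the config and split names listed in this card's metadata:
```python
from datasets import load_dataset

# Per-task details from the most recent run; the "latest" split name
# comes from the configs declared in this card's YAML frontmatter.
details = load_dataset(
    "open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B",
    "harness_truthfulqa_mc_0",
    split="latest",
)

# Aggregated metrics for the same run, via the "results" configuration.
aggregated = load_dataset(
    "open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B",
    "results",
    split="latest",
)
```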
## Latest results
These are the [latest results from run 2023-10-10T23:09:12.843992](https://huggingface.co/datasets/open-llm-leaderboard/details_Severian__ANIMA-Phi-Neptune-Mistral-7B/blob/main/results_2023-10-10T23-09-12.843992.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5294074045529835,
"acc_stderr": 0.03498173594538078,
"acc_norm": 0.5333993126870243,
"acc_norm_stderr": 0.03496838460772866,
"mc1": 0.41615667074663404,
"mc1_stderr": 0.01725565750290304,
"mc2": 0.5976206456012524,
"mc2_stderr": 0.015001808132807819
},
"harness|arc:challenge|25": {
"acc": 0.514505119453925,
"acc_stderr": 0.014605241081370056,
"acc_norm": 0.5597269624573379,
"acc_norm_stderr": 0.014506769524804236
},
"harness|hellaswag|10": {
"acc": 0.5718980282812188,
"acc_stderr": 0.004937924326742568,
"acc_norm": 0.7621987651862179,
"acc_norm_stderr": 0.004248666961833351
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5394736842105263,
"acc_stderr": 0.04056242252249033,
"acc_norm": 0.5394736842105263,
"acc_norm_stderr": 0.04056242252249033
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6037735849056604,
"acc_stderr": 0.030102793781791197,
"acc_norm": 0.6037735849056604,
"acc_norm_stderr": 0.030102793781791197
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110175,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110175
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.03794012674697031,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.03794012674697031
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006716,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006716
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.43829787234042555,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.43829787234042555,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35978835978835977,
"acc_stderr": 0.024718075944129277,
"acc_norm": 0.35978835978835977,
"acc_norm_stderr": 0.024718075944129277
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.0436031486007746,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.0436031486007746
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5870967741935483,
"acc_stderr": 0.028009138125400387,
"acc_norm": 0.5870967741935483,
"acc_norm_stderr": 0.028009138125400387
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3694581280788177,
"acc_stderr": 0.03395970381998574,
"acc_norm": 0.3694581280788177,
"acc_norm_stderr": 0.03395970381998574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.03713158067481913,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.03713158067481913
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.0331847733384533,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.0331847733384533
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7253886010362695,
"acc_stderr": 0.03221024508041152,
"acc_norm": 0.7253886010362695,
"acc_norm_stderr": 0.03221024508041152
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5,
"acc_stderr": 0.02535100632816969,
"acc_norm": 0.5,
"acc_norm_stderr": 0.02535100632816969
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683515,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683515
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4831932773109244,
"acc_stderr": 0.03246013680375308,
"acc_norm": 0.4831932773109244,
"acc_norm_stderr": 0.03246013680375308
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7119266055045872,
"acc_stderr": 0.01941644589263603,
"acc_norm": 0.7119266055045872,
"acc_norm_stderr": 0.01941644589263603
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.03293377139415191,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.03293377139415191
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.03166009679399812,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.03166009679399812
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7046413502109705,
"acc_stderr": 0.029696338713422882,
"acc_norm": 0.7046413502109705,
"acc_norm_stderr": 0.029696338713422882
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5874439461883408,
"acc_stderr": 0.03304062175449297,
"acc_norm": 0.5874439461883408,
"acc_norm_stderr": 0.03304062175449297
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6030534351145038,
"acc_stderr": 0.04291135671009224,
"acc_norm": 0.6030534351145038,
"acc_norm_stderr": 0.04291135671009224
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6776859504132231,
"acc_stderr": 0.04266416363352168,
"acc_norm": 0.6776859504132231,
"acc_norm_stderr": 0.04266416363352168
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.04691521224077742,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.04691521224077742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6196319018404908,
"acc_stderr": 0.038142698932618374,
"acc_norm": 0.6196319018404908,
"acc_norm_stderr": 0.038142698932618374
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.6504854368932039,
"acc_stderr": 0.047211885060971716,
"acc_norm": 0.6504854368932039,
"acc_norm_stderr": 0.047211885060971716
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.02514093595033543,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.02514093595033543
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7062579821200511,
"acc_stderr": 0.016287759388491665,
"acc_norm": 0.7062579821200511,
"acc_norm_stderr": 0.016287759388491665
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5606936416184971,
"acc_stderr": 0.026720034380514995,
"acc_norm": 0.5606936416184971,
"acc_norm_stderr": 0.026720034380514995
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.30614525139664805,
"acc_stderr": 0.01541449448790321,
"acc_norm": 0.30614525139664805,
"acc_norm_stderr": 0.01541449448790321
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.545751633986928,
"acc_stderr": 0.028509807802626585,
"acc_norm": 0.545751633986928,
"acc_norm_stderr": 0.028509807802626585
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6012861736334405,
"acc_stderr": 0.0278093225857745,
"acc_norm": 0.6012861736334405,
"acc_norm_stderr": 0.0278093225857745
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6049382716049383,
"acc_stderr": 0.027201117666925657,
"acc_norm": 0.6049382716049383,
"acc_norm_stderr": 0.027201117666925657
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.375886524822695,
"acc_stderr": 0.02889395541211588,
"acc_norm": 0.375886524822695,
"acc_norm_stderr": 0.02889395541211588
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3878748370273794,
"acc_stderr": 0.012444998309675617,
"acc_norm": 0.3878748370273794,
"acc_norm_stderr": 0.012444998309675617
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.46691176470588236,
"acc_stderr": 0.030306257722468324,
"acc_norm": 0.46691176470588236,
"acc_norm_stderr": 0.030306257722468324
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5049019607843137,
"acc_stderr": 0.02022686271003946,
"acc_norm": 0.5049019607843137,
"acc_norm_stderr": 0.02022686271003946
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6489795918367347,
"acc_stderr": 0.030555316755573637,
"acc_norm": 0.6489795918367347,
"acc_norm_stderr": 0.030555316755573637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7313432835820896,
"acc_stderr": 0.031343283582089536,
"acc_norm": 0.7313432835820896,
"acc_norm_stderr": 0.031343283582089536
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7192982456140351,
"acc_stderr": 0.034462962170884265,
"acc_norm": 0.7192982456140351,
"acc_norm_stderr": 0.034462962170884265
},
"harness|truthfulqa:mc|0": {
"mc1": 0.41615667074663404,
"mc1_stderr": 0.01725565750290304,
"mc2": 0.5976206456012524,
"mc2_stderr": 0.015001808132807819
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
toninhodjj/toninhodjdataset | 2023-10-10T23:15:17.000Z | [
"region:us"
] | toninhodjj | null | null | null | 0 | 0 | Entry not found |
tomashs/LSC_acronyms_topic_vectors_128 | 2023-10-10T23:26:07.000Z | [
"region:us"
] | tomashs | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: short_form
dtype: string
- name: long_form
dtype: string
- name: label
dtype: int64
- name: topic_vector
sequence: float64
splits:
- name: train
num_bytes: 469862809
num_examples: 352720
- name: validation
num_bytes: 100339691
num_examples: 75339
- name: test
num_bytes: 100732958
num_examples: 75540
download_size: 604818064
dataset_size: 670935458
---
# Dataset Card for "LSC_acronyms_topic_vectors_128"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Brecon/Auto_Set | 2023-10-10T23:28:24.000Z | [
"region:us"
] | Brecon | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: label
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 342418.2468619247
num_examples: 382
- name: test
num_bytes: 86052.75313807532
num_examples: 96
download_size: 263920
dataset_size: 428471.0
---
# Dataset Card for "Auto_Set"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gorkemsevinc/turkishDataSet | 2023-10-10T23:32:56.000Z | [
"region:us"
] | gorkemsevinc | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: review
dtype: string
- name: review_length
dtype: int64
splits:
- name: train
num_bytes: 134596742.14721414
num_examples: 362520
- name: validation
num_bytes: 14955564.852785867
num_examples: 40281
download_size: 95516965
dataset_size: 149552307.0
---
# Dataset Card for "turkishDataSet"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hahminlew/kream-product-blip-captions | 2023-10-11T00:41:06.000Z | [
"task_categories:text-to-image",
"size_categories:10K<n<100K",
"language:en",
"license:cc-by-nc-sa-4.0",
"fashion",
"cloth",
"computer-vision",
"region:us"
] | hahminlew | null | null | null | 0 | 0 | ---
license: cc-by-nc-sa-4.0
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1363424468
num_examples: 14904
download_size: 1328309729
dataset_size: 1363424468
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- text-to-image
language:
- en
size_categories:
- 10K<n<100K
images_reference:
- KREAM (https://kream.co.kr/)
pretty_name: KREAM Product Blip Captions
tags:
- fashion
- cloth
- computer-vision
---
## KREAM Product Blip Captions Dataset Information
**KREAM Product Blip Captions Dataset** is a dataset for finetuning text-to-image generative models, collected from [KREAM](https://kream.co.kr/), one of the best online resell markets in Korea.
Have fun creating realistic, high-quality fashion items!
This dataset consists of 'image' and 'text' key pairs.
The 'text' field follows the format 'category (e.g. outer), original product name (e.g. The North Face 1996 Eco Nuptse Jacket Black), BLIP caption (e.g. a photography of the north face black down jacket)'.
You can construct this dataset yourself and finetune Stable Diffusion from scratch using [easy-finetuning-stable-diffusion](https://github.com/hahminlew/easy-finetuning-stable-diffusion).
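Before training, it may help to sanity-check the data. Below is a minimal sketch of loading and inspecting an example with the Hugging Face `datasets` library; the `split(", ", 2)` call assumes the category and product name contain no ", " of their own, which may not hold for every entry.

```python
from datasets import load_dataset

# Download the single "train" split from the Hub.
ds = load_dataset("hahminlew/kream-product-blip-captions", split="train")

example = ds[0]
print(example["image"].size)   # the image feature decodes to a PIL.Image
print(example["text"])         # "category, product name, blip caption"

# Split the text field into its three documented parts.
# Assumption: the first two parts contain no ", " themselves.
category, name, caption = example["text"].split(", ", 2)
print(f"category={category!r}, name={name!r}, caption={caption!r}")
```

If your training script expects separate columns, the same split can be applied across the whole dataset with `ds.map` before finetuning.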
## Citation
If you use this dataset, please cite it as:
```
@misc{lew2023kream,
    author = {Lew, Hah Min},
    title = {KREAM Product BLIP Captions},
    year = {2023},
    howpublished = {\url{https://huggingface.co/datasets/hahminlew/kream-product-blip-captions/}}
}
``` |
cadaeic/2242_samples_synthesized_recipe_squad_dataset | 2023-10-10T23:48:47.000Z | [
"region:us"
] | cadaeic | null | null | null | 0 | 0 | Entry not found |
maxColten/TK | 2023-10-10T23:47:40.000Z | [
"region:us"
] | maxColten | null | null | null | 0 | 0 | Entry not found |